Apr 21 15:10:33.984027 ip-10-0-143-120 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:10:34.505303 ip-10-0-143-120 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:34.505303 ip-10-0-143-120 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:10:34.505303 ip-10-0-143-120 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:34.505303 ip-10-0-143-120 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:10:34.505303 ip-10-0-143-120 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:34.507227 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.507008 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515399 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515418 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515423 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515427 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515430 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:34.515424 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515433 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515436 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515439 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515441 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515444 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515448 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515452 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515456 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515460 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515463 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515466 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515469 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515472 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515475 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515478 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515481 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515483 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515486 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515494 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:34.515645 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515497 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515499 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515502 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515505 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515507 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515510 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515513 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515516 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515518 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515521 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515523 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515526 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515528 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515531 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515534 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515537 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515540 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515543 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515545 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515548 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:34.516123 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515551 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515554 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515557 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515560 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515563 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515566 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515568 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515571 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515574 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515577 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515580 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515582 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515585 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515587 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515590 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515593 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515595 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515597 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515600 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515603 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:34.516644 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515605 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515608 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515611 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515613 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515616 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515618 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515621 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515625 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515627 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515630 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515633 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515635 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515639 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515642 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515645 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515647 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515651 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515653 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515656 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515659 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:34.517160 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515662 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.515664 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516068 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516074 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516077 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516079 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516082 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516085 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516087 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516090 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516093 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516095 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516098 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516100 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516103 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516106 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516109 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516112 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516115 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516118 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:34.517639 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516121 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516124 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516126 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516129 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516132 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516135 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516138 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516141 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516143 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516146 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516148 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516151 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516153 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516156 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516158 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516161 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516163 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516166 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516168 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516171 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:34.518178 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516174 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516176 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516179 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516181 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516183 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516186 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516188 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516191 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516193 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516196 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516199 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516202 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516207 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516210 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516214 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516216 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516219 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516221 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516224 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516227 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:34.518702 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516230 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516232 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516236 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516240 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516244 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516249 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516253 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516257 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516261 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516264 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516267 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516269 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516273 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516276 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516279 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516282 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516285 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516288 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516290 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:34.519251 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516293 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516296 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516298 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516301 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516304 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516307 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516310 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516312 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.516315 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517765 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517789 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517796 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517801 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517809 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517813 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517818 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517823 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517827 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517830 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517833 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517836 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517839 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:10:34.519713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517843 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517846 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517849 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517852 2583 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517854 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517857 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517863 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517866 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517870 2583 flags.go:64] FLAG: --config-dir=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517872 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517876 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517880 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517883 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517886 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517889 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517893 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517896 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517899 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517902 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517905 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517909 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517913 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517916 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517918 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517922 2583 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:10:34.520263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517925 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517929 2583 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517933 2583 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517936 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517939 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517942 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517946 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517949 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517952 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517955 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517958 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517961 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517964 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517967 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517970 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517973 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517976
2583 flags.go:64] FLAG: --feature-gates="" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517980 2583 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517983 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517986 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517989 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517993 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517996 2583 flags.go:64] FLAG: --help="false" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.517999 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518002 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:10:34.520913 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518005 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518008 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518011 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518015 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518018 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:10:34.521520 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518020 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518023 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518027 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518030 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518033 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518036 2583 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518039 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518042 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518045 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518048 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518051 2583 flags.go:64] FLAG: --lock-file="" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518053 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518056 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518060 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518065 2583 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518068 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518071 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518074 2583 flags.go:64] FLAG: --logging-format="text" Apr 21 15:10:34.521520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518077 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518080 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518083 2583 flags.go:64] FLAG: --manifest-url="" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518086 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518093 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518096 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518100 2583 flags.go:64] FLAG: --max-pods="110" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518103 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518106 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518109 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518112 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:10:34.522093 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518115 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518118 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518121 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518129 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518132 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518135 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518138 2583 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518142 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518147 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518150 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518154 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518157 2583 flags.go:64] FLAG: --port="10250" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518160 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:10:34.522093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518163 2583 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-00acf259f51593781" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518166 2583 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518169 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518172 2583 flags.go:64] FLAG: --register-node="true" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518175 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518178 2583 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518182 2583 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518185 2583 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518188 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518192 2583 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518195 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518199 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518202 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518206 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518209 2583 flags.go:64] FLAG: --runonce="false" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518212 2583 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518215 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518218 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518221 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518224 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518227 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518230 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518234 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518237 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518239 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518242 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:10:34.522675 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518246 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518249 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518252 2583 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518255 2583 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518260 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518263 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518266 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518270 2583 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518273 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518276 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518279 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518282 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518285 2583 flags.go:64] FLAG: --v="2" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518290 2583 flags.go:64] FLAG: --version="false" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518294 2583 flags.go:64] FLAG: --vmodule="" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518298 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518302 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518400 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:10:34.523300 ip-10-0-143-120 
kubenswrapper[2583]: W0421 15:10:34.518404 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518409 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518412 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518415 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518417 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518420 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:10:34.523300 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518423 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518426 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518428 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518431 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518433 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518436 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518439 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:10:34.523895 ip-10-0-143-120 
kubenswrapper[2583]: W0421 15:10:34.518441 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518444 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518446 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518449 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518452 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518454 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518457 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518460 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518462 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518464 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518467 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518469 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518472 2583 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 21 15:10:34.523895 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518474 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518477 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518479 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518482 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518484 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518489 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518491 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518495 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518498 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518501 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518503 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518506 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:10:34.524439 
ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518509 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518511 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518514 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518517 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518519 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518522 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518524 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:10:34.524439 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518527 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518530 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518532 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518535 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518538 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518540 2583 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518543 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518545 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518548 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518550 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518554 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518557 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518561 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518564 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518566 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518569 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518572 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518574 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 
21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518578 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518580 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:10:34.524932 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518584 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518587 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518591 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518595 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518597 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518600 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518602 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518605 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518608 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518611 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 
15:10:34.518613 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518616 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518618 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518621 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518624 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518627 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518630 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518632 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518635 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:34.525419 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.518637 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.518643 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.525408 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.525428 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525481 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525486 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525489 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525493 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525496 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525499 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525502 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525505 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525507 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525510 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525513 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:34.525904 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525516 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525518 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525521 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525524 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525526 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525529 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525532 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525534 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525537 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525539 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525542 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525545 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525547 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525550 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525556 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525559 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525562 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525564 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525567 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525570 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:34.526285 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525573 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525577 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525581 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525584 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525587 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525591 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525593 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525596 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525599 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525602 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525605 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525607 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525610 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525612 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525615 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525618 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525620 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525623 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525626 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:34.526788 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525629 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525633 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525636 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525638 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525642 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525645 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525647 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525650 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525653 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525656 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525659 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525661 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525664 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525667 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525670 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525672 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525675 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525677 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525679 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525682 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:34.527277 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525684 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525687 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525689 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525692 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525694 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525697 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525699 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525702 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525704 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525706 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525709 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525712 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525715 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525717 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525719 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525722 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:34.527757 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.525728 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525848 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525853 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525857 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525860 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525864 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525866 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525871 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525874 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525878 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525881 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525884 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525887 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525889 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525892 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525895 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525897 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525900 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525902 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525905 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:34.528218 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525907 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525910 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525913 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525916 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525919 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525922 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525925 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525927 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525930 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525932 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525934 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525937 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525940 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525942 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525945 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525949 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525953 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525956 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525960 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:34.528726 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525962 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525965 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525968 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525972 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525975 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525977 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525980 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525983 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525986 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525989 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525991 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525993 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525996 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.525998 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526001 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526003 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526006 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526008 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526011 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526013 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:34.529217 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526016 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526018 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526021 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526023 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526026 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526029 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526031 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526034 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526053 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526057 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526060 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526063 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526066 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526069 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526073 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526076 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526078 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526082 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526085 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526087 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:34.529711 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526090 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526093 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526096 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526099 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526102 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526104 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526107 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:34.526110 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.526114 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:10:34.530259 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.526834 2583 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 15:10:34.530849 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.530835 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 15:10:34.532025 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.532013 2583 server.go:1019] "Starting client certificate rotation"
Apr 21 15:10:34.532132 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.532114 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:10:34.532170 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.532161 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:10:34.562331 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.562313 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:10:34.567018 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.567003 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:10:34.586059 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.586037 2583 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:10:34.592254 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.592236 2583 log.go:25] "Validated CRI v1 image API"
Apr 21 15:10:34.594197 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.594177 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:10:34.599294 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.599269 2583 fs.go:135] Filesystem UUIDs: map[2c750a31-ad39-4a6a-9e91-90fd3a201ac2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9d585f63-256c-4e0e-a8f3-68854e88c582:/dev/nvme0n1p3]
Apr 21 15:10:34.599358 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.599293 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:10:34.602506 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.602481 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:10:34.605395 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.605279 2583 manager.go:217] Machine: {Timestamp:2026-04-21 15:10:34.603057815 +0000 UTC m=+0.482829624 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499996 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24c5385755ebf4de2e30d350a790c9 SystemUUID:ec24c538-5755-ebf4-de2e-30d350a790c9 BootID:4ae5baf2-dd6b-42d9-86cf-12b96d504d09 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a0:96:95:e4:03 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a0:96:95:e4:03 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:17:d7:45:05:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:10:34.605395 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.605386 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:10:34.605523 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.605502 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 15:10:34.605841 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.605819 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 15:10:34.606023 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.605842 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-120.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 15:10:34.606100 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.606037 2583 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 15:10:34.606100 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.606049 2583 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 15:10:34.606100 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.606067 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:10:34.607095 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.607082 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:10:34.608624 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.608612 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:10:34.608756 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.608738 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 15:10:34.612373 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.612361 2583 kubelet.go:491] "Attempting to sync node with API server" Apr 21 15:10:34.612439 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.612379 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 15:10:34.612439 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.612395 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 15:10:34.612439 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.612410 2583 kubelet.go:397] "Adding apiserver pod source" Apr 21 15:10:34.612439 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.612422 2583 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 15:10:34.613713 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.613700 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:10:34.613806 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.613725 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:10:34.617044 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.617028 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 15:10:34.619050 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.619031 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 15:10:34.620552 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620537 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620563 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620576 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620603 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620612 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620620 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 15:10:34.620638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620629 2583 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 15:10:34.620902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620660 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 15:10:34.620902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620673 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 15:10:34.620902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620684 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 15:10:34.620902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620704 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 15:10:34.620902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.620757 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 15:10:34.621902 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.621891 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 15:10:34.621951 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.621904 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 15:10:34.622332 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.622311 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-120.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 15:10:34.622417 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.622361 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 15:10:34.624703 ip-10-0-143-120 kubenswrapper[2583]: 
I0421 15:10:34.624685 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-120.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 15:10:34.625972 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.625957 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w5v5x" Apr 21 15:10:34.626309 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.626297 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 15:10:34.626359 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.626351 2583 server.go:1295] "Started kubelet" Apr 21 15:10:34.626458 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.626426 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 15:10:34.626543 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.626449 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 15:10:34.626543 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.626515 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 15:10:34.627219 ip-10-0-143-120 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 15:10:34.630461 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.630439 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 15:10:34.632136 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.632113 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w5v5x" Apr 21 15:10:34.634793 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.634766 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 21 15:10:34.640354 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.640338 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 15:10:34.640354 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.640351 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 15:10:34.641259 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.641235 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:34.641527 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.641515 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 15:10:34.641613 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.641586 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 15:10:34.641809 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.641797 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 15:10:34.641907 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.641881 2583 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 15:10:34.641907 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.641822 2583 factory.go:153] Registering CRI-O factory Apr 21 15:10:34.642055 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.641924 2583 factory.go:223] Registration of the crio container factory successfully Apr 21 15:10:34.642055 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642000 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 15:10:34.642055 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642052 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 21 15:10:34.642191 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642062 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 21 15:10:34.642191 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642012 2583 factory.go:55] Registering systemd factory Apr 21 15:10:34.642191 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642083 2583 factory.go:223] Registration of the systemd container factory successfully Apr 21 15:10:34.642191 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642102 2583 factory.go:103] Registering Raw factory Apr 21 15:10:34.642191 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642113 2583 manager.go:1196] Started watching for new ooms in manager Apr 21 15:10:34.642489 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.642474 2583 manager.go:319] Starting recovery of all containers Apr 21 15:10:34.644398 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.644375 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:34.649164 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.649146 
2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-120.ec2.internal\" not found" node="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.652293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.652281 2583 manager.go:324] Recovery completed Apr 21 15:10:34.656005 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.655993 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:34.658596 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.658581 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.658654 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.658608 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.658654 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.658619 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.659118 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.659103 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 15:10:34.659118 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.659117 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 15:10:34.659215 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.659132 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:10:34.661682 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.661671 2583 policy_none.go:49] "None policy: Start" Apr 21 15:10:34.661727 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.661687 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 15:10:34.661727 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.661697 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 21 
15:10:34.695855 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.695842 2583 manager.go:341] "Starting Device Plugin manager" Apr 21 15:10:34.695934 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.695871 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 15:10:34.695934 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.695881 2583 server.go:85] "Starting device plugin registration server" Apr 21 15:10:34.696096 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.696081 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 15:10:34.696190 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.696097 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 15:10:34.696241 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.696208 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 15:10:34.696281 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.696273 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 15:10:34.696281 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.696280 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 15:10:34.696891 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.696871 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 15:10:34.696967 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.696914 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:34.742687 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.742651 2583 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 15:10:34.743845 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.743829 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 15:10:34.743906 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.743859 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 15:10:34.743906 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.743878 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 15:10:34.743906 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.743887 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 15:10:34.744028 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.743965 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 15:10:34.747425 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.747409 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:34.796816 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.796755 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:34.797580 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.797567 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.797643 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.797596 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.797643 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.797612 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.797643 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.797637 2583 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.810631 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.810615 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.810677 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.810636 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-120.ec2.internal\": node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:34.827987 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.827963 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:34.844796 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.844759 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal"] Apr 21 15:10:34.844893 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.844856 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:34.845630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.845614 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.845718 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.845644 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.845718 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.845658 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.847121 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847107 2583 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 21 15:10:34.847271 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847257 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.847328 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847286 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:34.847749 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847735 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.847830 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847735 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.847830 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847761 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.847830 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847786 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.847830 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847796 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.847830 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.847813 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.849050 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.849035 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.849105 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.849066 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:34.849667 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.849654 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:34.849718 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.849681 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:34.849718 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.849692 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:34.864281 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.864266 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-120.ec2.internal\" not found" node="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.868431 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.868416 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-120.ec2.internal\" not found" node="ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.928481 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:34.928451 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:34.942704 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.942681 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e86a3d6d8eccc7f6cc8d288787df4758-config\") pod 
\"kube-apiserver-proxy-ip-10-0-143-120.ec2.internal\" (UID: \"e86a3d6d8eccc7f6cc8d288787df4758\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.942838 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.942710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:34.942838 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:34.942729 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.029161 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.029135 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found" Apr 21 15:10:35.043816 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043790 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e86a3d6d8eccc7f6cc8d288787df4758-config\") pod \"kube-apiserver-proxy-ip-10-0-143-120.ec2.internal\" (UID: \"e86a3d6d8eccc7f6cc8d288787df4758\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.043816 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043801 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/e86a3d6d8eccc7f6cc8d288787df4758-config\") pod \"kube-apiserver-proxy-ip-10-0-143-120.ec2.internal\" (UID: \"e86a3d6d8eccc7f6cc8d288787df4758\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.043919 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043836 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.043919 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043854 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.043919 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" Apr 21 15:10:35.044018 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.043920 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7731dc1999905d873408b3b927eba2ce-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal\" (UID: \"7731dc1999905d873408b3b927eba2ce\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal"
Apr 21 15:10:35.129517 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.129429 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found"
Apr 21 15:10:35.166969 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.166931 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal"
Apr 21 15:10:35.170799 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.170764 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal"
Apr 21 15:10:35.229548 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.229497 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found"
Apr 21 15:10:35.330032 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.330011 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found"
Apr 21 15:10:35.430595 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.430533 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found"
Apr 21 15:10:35.531168 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:35.531142 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-120.ec2.internal\" not found"
Apr 21 15:10:35.531768 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.531167 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:10:35.531768 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.531306 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:10:35.531768 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.531343 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:10:35.595937 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.595913 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:10:35.616311 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.616288 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:10:35.634912 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.634876 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:05:34 +0000 UTC" deadline="2028-01-06 23:40:01.014758802 +0000 UTC"
Apr 21 15:10:35.634912 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.634908 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15008h29m25.379854665s"
Apr 21 15:10:35.641253 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.641237 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 15:10:35.641369 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.641239 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal"
Apr 21 15:10:35.663380 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.663208 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 15:10:35.664474 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.664452 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal"
Apr 21 15:10:35.667381 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.667361 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:10:35.678360 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.678342 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 15:10:35.705215 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.705154 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9lmn8"
Apr 21 15:10:35.716189 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.716167 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9lmn8"
Apr 21 15:10:35.752913 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:35.752884 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86a3d6d8eccc7f6cc8d288787df4758.slice/crio-35e25bebcf79a418bd27478c82c1decf9fbc280b7f84fc2021e19933c673d46d WatchSource:0}: Error finding container 35e25bebcf79a418bd27478c82c1decf9fbc280b7f84fc2021e19933c673d46d: Status 404 returned error can't find the container with id 35e25bebcf79a418bd27478c82c1decf9fbc280b7f84fc2021e19933c673d46d
Apr 21 15:10:35.753421 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:35.753402 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7731dc1999905d873408b3b927eba2ce.slice/crio-67af0a68a3fb3bc3ab18e115b513afb5272e5af053fa13578e1ee4bbc1d22317 WatchSource:0}: Error finding container 67af0a68a3fb3bc3ab18e115b513afb5272e5af053fa13578e1ee4bbc1d22317: Status 404 returned error can't find the container with id 67af0a68a3fb3bc3ab18e115b513afb5272e5af053fa13578e1ee4bbc1d22317
Apr 21 15:10:35.759510 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:35.759496 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:10:36.609077 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.609047 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:10:36.612903 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.612883 2583 apiserver.go:52] "Watching apiserver"
Apr 21 15:10:36.621169 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.621146 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 15:10:36.621552 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.621529 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-4zr9g","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph","openshift-dns/node-resolver-g64wd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal","openshift-multus/multus-additional-cni-plugins-g2z5r","openshift-multus/multus-j58rf","openshift-network-operator/iptables-alerter-7n5n5","kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal","openshift-cluster-node-tuning-operator/tuned-68mrx","openshift-image-registry/node-ca-bhzt8","openshift-multus/network-metrics-daemon-wpmpv","openshift-network-diagnostics/network-check-target-bhk2h","openshift-ovn-kubernetes/ovnkube-node-bf2p7","kube-system/global-pull-secret-syncer-b8vgq"]
Apr 21 15:10:36.623039 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.623010 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bhzt8"
Apr 21 15:10:36.624242 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.624219 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.625381 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.625359 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.626147 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.626127 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dxv9\""
Apr 21 15:10:36.626237 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.626152 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 15:10:36.626580 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.626542 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:36.626649 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.626621 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:36.627021 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.627003 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.627453 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.627433 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.627535 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.627503 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.627927 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.627878 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.628185 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628156 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.628272 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628190 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.628272 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628211 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 15:10:36.628375 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628339 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 15:10:36.628671 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628653 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 15:10:36.628759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628718 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h92dl\""
Apr 21 15:10:36.628835 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628754 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 15:10:36.628835 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628720 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kf7kv\""
Apr 21 15:10:36.628929 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.628860 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.629583 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.629565 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.629970 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.629951 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.630538 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.630509 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.630633 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.630520 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 15:10:36.631076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.630946 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 15:10:36.631076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.630976 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wpfg2\""
Apr 21 15:10:36.631076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.630979 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 15:10:36.631606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.631586 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j4nnx\""
Apr 21 15:10:36.631883 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.631865 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 15:10:36.633009 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.632991 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.634306 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.634178 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7n5n5"
Apr 21 15:10:36.635274 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.635197 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g5fts\""
Apr 21 15:10:36.635653 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.635446 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.635653 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.635580 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.636065 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.636046 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.636837 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.636820 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4zr9g"
Apr 21 15:10:36.637122 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.637102 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.637364 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.637345 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.638034 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.637981 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 15:10:36.638456 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.638441 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:36.638531 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.638507 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:36.639898 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.639870 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 15:10:36.639995 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.639955 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 15:10:36.640058 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.639870 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9rlwd\""
Apr 21 15:10:36.640268 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640250 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.640350 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.640312 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:36.640914 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640834 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 15:10:36.640914 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640891 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kb2k2\""
Apr 21 15:10:36.641060 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640946 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-596kz\""
Apr 21 15:10:36.641060 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640969 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 15:10:36.641060 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.640894 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 15:10:36.645374 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.642929 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 15:10:36.652603 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652581 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.652691 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652618 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-bin\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.652691 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652635 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-tmp\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.652691 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652659 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.652691 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-multus\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.652938 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652745 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-hostroot\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.652938 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652800 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-sys-fs\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.652938 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652826 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f83ccf86-3867-4183-a1d2-d8fb6871e584-serviceca\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8"
Apr 21 15:10:36.652938 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652850 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-var-lib-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.652938 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652895 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-env-overrides\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652942 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-kubernetes\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.652979 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653019 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-sys\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653050 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-system-cni-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653080 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653112 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cni-binary-copy\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.653148 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653141 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-conf-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653166 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzbvn\" (UniqueName: \"kubernetes.io/projected/ecffb9c8-2d5e-409b-8013-126edd86ac8a-kube-api-access-lzbvn\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653189 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-host\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653253 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj9k\" (UniqueName: \"kubernetes.io/projected/2a989bd5-4d7c-4917-b441-576b61407d76-kube-api-access-zkj9k\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653288 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-system-cni-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653310 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cnibin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653341 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-socket-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-slash\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653396 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-ovn\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653409 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-node-log\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.653420 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653422 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-modprobe-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653436 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-lib-modules\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653477 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-k8s-cni-cncf-io\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653509 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwdv8\" (UniqueName: \"kubernetes.io/projected/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-kube-api-access-wwdv8\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653535 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwlb\" (UniqueName: \"kubernetes.io/projected/f83ccf86-3867-4183-a1d2-d8fb6871e584-kube-api-access-wzwlb\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653597 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-systemd-units\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-netns\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653665 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvwj\" (UniqueName: \"kubernetes.io/projected/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-kube-api-access-ksvwj\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653692 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-cnibin\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653717 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6kkw\" (UniqueName: \"kubernetes.io/projected/93dbff63-b3ad-4508-8f23-3d4394458b3b-kube-api-access-f6kkw\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653741 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653766 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653808 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-daemon-config\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653834 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653856 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-config\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653881 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5vn\" (UniqueName: \"kubernetes.io/projected/9b3e45df-578e-4456-b850-310c9d4a72fa-kube-api-access-wp5vn\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5"
Apr 21 15:10:36.654024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653903 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-dbus\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-multus-certs\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653951 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-systemd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653974 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-netd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.653998 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-tmp-dir\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654048 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovn-node-metrics-cert\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654087 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-script-lib\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654113 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-systemd\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654138 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-kubelet\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654160 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-hosts-file\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd" Apr 21 
15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654262 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-conf\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654302 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-kubelet\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654348 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654383 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysconfig\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654409 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-cni-dir\") pod \"multus-j58rf\" 
(UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654431 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-etc-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.654712 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654464 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-log-socket\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654503 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hb6\" (UniqueName: \"kubernetes.io/projected/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-kube-api-access-h2hb6\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654528 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b3e45df-578e-4456-b850-310c9d4a72fa-iptables-alerter-script\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654558 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/9006254c-0f60-4649-85ee-dee1f0871d1b-agent-certs\") pod \"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654610 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654646 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-run\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654671 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-etc-kubernetes\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654694 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-device-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.655554 ip-10-0-143-120 
kubenswrapper[2583]: I0421 15:10:36.654717 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-var-lib-kubelet\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654741 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-kubelet-config\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654805 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654850 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-socket-dir-parent\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654876 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9006254c-0f60-4649-85ee-dee1f0871d1b-konnectivity-ca\") pod 
\"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654904 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654928 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-os-release\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654951 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f83ccf86-3867-4183-a1d2-d8fb6871e584-host\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.655554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654971 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-tuned\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.654993 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-os-release\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655015 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-bin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655038 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-registration-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655060 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kube-api-access-p5qbn\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655084 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b3e45df-578e-4456-b850-310c9d4a72fa-host-slash\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 
15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655107 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-netns\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.656128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.655130 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.716819 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.716768 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:35 +0000 UTC" deadline="2027-09-14 20:23:12.990601589 +0000 UTC" Apr 21 15:10:36.716819 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.716814 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12269h12m36.273791127s" Apr 21 15:10:36.747897 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.747854 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" event={"ID":"7731dc1999905d873408b3b927eba2ce","Type":"ContainerStarted","Data":"67af0a68a3fb3bc3ab18e115b513afb5272e5af053fa13578e1ee4bbc1d22317"} Apr 21 15:10:36.748882 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.748853 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" 
event={"ID":"e86a3d6d8eccc7f6cc8d288787df4758","Type":"ContainerStarted","Data":"35e25bebcf79a418bd27478c82c1decf9fbc280b7f84fc2021e19933c673d46d"} Apr 21 15:10:36.755367 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755348 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755373 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-bin\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755391 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-tmp\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755431 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-multus\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755455 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-hostroot\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.755484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755478 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-sys-fs\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755481 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-bin\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755502 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f83ccf86-3867-4183-a1d2-d8fb6871e584-serviceca\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755526 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-var-lib-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755543 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-hostroot\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755566 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-env-overrides\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755568 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-multus\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755545 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-kubernetes\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.755787 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755632 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755658 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-sys\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755684 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-system-cni-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755709 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755732 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cni-binary-copy\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " 
pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.755787 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755756 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-conf-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755763 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755799 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzbvn\" (UniqueName: \"kubernetes.io/projected/ecffb9c8-2d5e-409b-8013-126edd86ac8a-kube-api-access-lzbvn\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755808 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-sys-fs\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755823 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-host\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755848 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj9k\" (UniqueName: \"kubernetes.io/projected/2a989bd5-4d7c-4917-b441-576b61407d76-kube-api-access-zkj9k\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755856 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-system-cni-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755871 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-system-cni-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755895 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cnibin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-kubernetes\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 
15:10:36.755921 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-socket-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755947 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-slash\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755970 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-ovn\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.755994 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-node-log\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756018 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-modprobe-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756027 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-lib-modules\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756062 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-env-overrides\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.756432 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756071 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-k8s-cni-cncf-io\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756099 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwdv8\" (UniqueName: \"kubernetes.io/projected/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-kube-api-access-wwdv8\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756111 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-var-lib-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756121 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756217 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-socket-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756124 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwlb\" (UniqueName: \"kubernetes.io/projected/f83ccf86-3867-4183-a1d2-d8fb6871e584-kube-api-access-wzwlb\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756264 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-slash\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-systemd-units\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756307 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-ovn\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756348 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-node-log\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756352 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-netns\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756381 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvwj\" (UniqueName: \"kubernetes.io/projected/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-kube-api-access-ksvwj\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756403 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-cnibin\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756426 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kkw\" (UniqueName: \"kubernetes.io/projected/93dbff63-b3ad-4508-8f23-3d4394458b3b-kube-api-access-f6kkw\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756451 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-lib-modules\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756480 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.757256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756503 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-k8s-cni-cncf-io\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756509 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-daemon-config\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756537 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756560 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-config\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756587 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5vn\" (UniqueName: \"kubernetes.io/projected/9b3e45df-578e-4456-b850-310c9d4a72fa-kube-api-access-wp5vn\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756612 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-dbus\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756618 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756638 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-multus-certs\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756664 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-systemd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756687 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-netd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756713 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-tmp-dir\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovn-node-metrics-cert\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756073 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-sys\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756803 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-script-lib\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756833 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-systemd\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756835 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-host\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756830 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-system-cni-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756859 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-kubelet\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756885 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756912 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-run-netns\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756919 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-hosts-file\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756946 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-conf\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cnibin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756973 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-kubelet\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756998 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757045 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysconfig\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757074 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-cni-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757087 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-dbus\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757102 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-etc-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-multus-certs\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756749 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f83ccf86-3867-4183-a1d2-d8fb6871e584-serviceca\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757159 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-cnibin\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757189 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-cni-binary-copy\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757175 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-log-socket\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.758759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757207 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-etc-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757224 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hb6\" (UniqueName: \"kubernetes.io/projected/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-kube-api-access-h2hb6\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757245 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-conf-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757251 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b3e45df-578e-4456-b850-310c9d4a72fa-iptables-alerter-script\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757276 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9006254c-0f60-4649-85ee-dee1f0871d1b-agent-certs\") pod \"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757369 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-daemon-config\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757396 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysctl-conf\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757438 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a989bd5-4d7c-4917-b441-576b61407d76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757452 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757497 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-run\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757522 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-etc-kubernetes\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757550 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-device-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757575 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-var-lib-kubelet\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.757595 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757647 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-kubelet-config\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-tmp-dir\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd"
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.757695 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.257644954 +0000 UTC m=+3.137416745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:36.759582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757598 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-kubelet-config\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757723 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-openvswitch\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757739 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757749 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-config\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757758 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757768 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-socket-dir-parent\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757813 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-etc-kubernetes\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757815 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-modprobe-d\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757828 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-run\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757853 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-var-lib-kubelet\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757856 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-device-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757851 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-etc-selinux\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758412 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-run-systemd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758259 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-kubelet\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758505 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-cni-netd\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758549 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-systemd\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758790 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9006254c-0f60-4649-85ee-dee1f0871d1b-konnectivity-ca\") pod \"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g"
Apr 21 15:10:36.760349 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.758799 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.758865 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.258845832 +0000 UTC m=+3.138617636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.756833 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-systemd-units\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758932 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-log-socket\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.758863 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-hosts-file\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759057 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-cni-dir\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757830 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9006254c-0f60-4649-85ee-dee1f0871d1b-konnectivity-ca\") pod \"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759249 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759317 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-os-release\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759336 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovnkube-script-lib\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759352 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f83ccf86-3867-4183-a1d2-d8fb6871e584-host\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759389 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-tuned\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759436 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-os-release\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759567 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-bin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759602 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-registration-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759578 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b3e45df-578e-4456-b850-310c9d4a72fa-iptables-alerter-script\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.757692 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-host-kubelet\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.761119 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759634 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kube-api-access-p5qbn\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759712 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f83ccf86-3867-4183-a1d2-d8fb6871e584-host\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.761828 ip-10-0-143-120 
kubenswrapper[2583]: I0421 15:10:36.759755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b3e45df-578e-4456-b850-310c9d4a72fa-host-slash\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760015 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-netns\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760169 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-os-release\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760224 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-sysconfig\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 
15:10:36.760315 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b3e45df-578e-4456-b850-310c9d4a72fa-host-slash\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760368 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-run-netns\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760426 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760429 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760507 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a989bd5-4d7c-4917-b441-576b61407d76-os-release\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.761828 ip-10-0-143-120 
kubenswrapper[2583]: I0421 15:10:36.760611 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-host-var-lib-cni-bin\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.760639 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-registration-dir\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.759995 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecffb9c8-2d5e-409b-8013-126edd86ac8a-multus-socket-dir-parent\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.761828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.761048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-ovn-node-metrics-cert\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.762518 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.762321 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-tmp\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.765666 ip-10-0-143-120 
kubenswrapper[2583]: I0421 15:10:36.765643 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj9k\" (UniqueName: \"kubernetes.io/projected/2a989bd5-4d7c-4917-b441-576b61407d76-kube-api-access-zkj9k\") pod \"multus-additional-cni-plugins-g2z5r\" (UID: \"2a989bd5-4d7c-4917-b441-576b61407d76\") " pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.765666 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.765657 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwlb\" (UniqueName: \"kubernetes.io/projected/f83ccf86-3867-4183-a1d2-d8fb6871e584-kube-api-access-wzwlb\") pod \"node-ca-bhzt8\" (UID: \"f83ccf86-3867-4183-a1d2-d8fb6871e584\") " pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.765836 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.765645 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzbvn\" (UniqueName: \"kubernetes.io/projected/ecffb9c8-2d5e-409b-8013-126edd86ac8a-kube-api-access-lzbvn\") pod \"multus-j58rf\" (UID: \"ecffb9c8-2d5e-409b-8013-126edd86ac8a\") " pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.766057 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.766035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvwj\" (UniqueName: \"kubernetes.io/projected/572e9b4f-7cd2-40c1-9fe5-538bad3971a7-kube-api-access-ksvwj\") pod \"node-resolver-g64wd\" (UID: \"572e9b4f-7cd2-40c1-9fe5-538bad3971a7\") " pod="openshift-dns/node-resolver-g64wd" Apr 21 15:10:36.766137 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.766038 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9006254c-0f60-4649-85ee-dee1f0871d1b-agent-certs\") pod \"konnectivity-agent-4zr9g\" (UID: \"9006254c-0f60-4649-85ee-dee1f0871d1b\") " pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:36.767168 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.767142 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5vn\" (UniqueName: \"kubernetes.io/projected/9b3e45df-578e-4456-b850-310c9d4a72fa-kube-api-access-wp5vn\") pod \"iptables-alerter-7n5n5\" (UID: \"9b3e45df-578e-4456-b850-310c9d4a72fa\") " pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.767834 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.767803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93dbff63-b3ad-4508-8f23-3d4394458b3b-etc-tuned\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.768120 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.768077 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6kkw\" (UniqueName: \"kubernetes.io/projected/93dbff63-b3ad-4508-8f23-3d4394458b3b-kube-api-access-f6kkw\") pod \"tuned-68mrx\" (UID: \"93dbff63-b3ad-4508-8f23-3d4394458b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.768962 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.768914 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c-kube-api-access-p5qbn\") pod \"aws-ebs-csi-driver-node-r69ph\" (UID: \"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.769128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.769109 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwdv8\" (UniqueName: \"kubernetes.io/projected/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-kube-api-access-wwdv8\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " 
pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:36.769407 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.769386 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hb6\" (UniqueName: \"kubernetes.io/projected/e46ae269-acf9-41f8-bfa2-1d7fd1b27c47-kube-api-access-h2hb6\") pod \"ovnkube-node-bf2p7\" (UID: \"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.770427 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.770413 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:36.770427 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.770430 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:36.770552 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.770439 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:36.770552 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:36.770497 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.270481446 +0000 UTC m=+3.150253259 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:36.938638 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.938553 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bhzt8" Apr 21 15:10:36.945477 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.945453 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:10:36.953104 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.953077 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g64wd" Apr 21 15:10:36.957992 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.957972 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" Apr 21 15:10:36.964700 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.964684 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j58rf" Apr 21 15:10:36.971261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.971241 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68mrx" Apr 21 15:10:36.978762 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.978743 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7n5n5" Apr 21 15:10:36.986281 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.986265 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" Apr 21 15:10:36.991816 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:36.991797 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:37.264499 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.264422 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:37.264661 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.264517 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:37.264661 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.264542 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.264661 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.264596 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:37.264661 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.264599 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:10:38.264581237 +0000 UTC m=+4.144353032 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.264661 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.264637 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:10:38.26462516 +0000 UTC m=+4.144396967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:37.365343 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.365265 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:37.365441 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.365418 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:37.365441 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.365445 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:37.365615 ip-10-0-143-120 
kubenswrapper[2583]: E0421 15:10:37.365458 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.365615 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.365520 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:10:38.365499628 +0000 UTC m=+4.245271439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.371090 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.371067 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3e45df_578e_4456_b850_310c9d4a72fa.slice/crio-3f4689e64eabb605f57427fefb88e1b69fe44c0d57cb314487e8d391d71bec12 WatchSource:0}: Error finding container 3f4689e64eabb605f57427fefb88e1b69fe44c0d57cb314487e8d391d71bec12: Status 404 returned error can't find the container with id 3f4689e64eabb605f57427fefb88e1b69fe44c0d57cb314487e8d391d71bec12 Apr 21 15:10:37.372942 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.372843 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecffb9c8_2d5e_409b_8013_126edd86ac8a.slice/crio-db0c63dc7469ff9934fa963f048bc7e1b8eae429c9c20a5be33897c268e30dd7 WatchSource:0}: Error finding container db0c63dc7469ff9934fa963f048bc7e1b8eae429c9c20a5be33897c268e30dd7: Status 404 returned error can't find the container with id db0c63dc7469ff9934fa963f048bc7e1b8eae429c9c20a5be33897c268e30dd7
Apr 21 15:10:37.373747 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.373722 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46ae269_acf9_41f8_bfa2_1d7fd1b27c47.slice/crio-430ca5cf6996773ff07b4743869d5ba401aeaaaac0523bfac08435f9368ab20b WatchSource:0}: Error finding container 430ca5cf6996773ff07b4743869d5ba401aeaaaac0523bfac08435f9368ab20b: Status 404 returned error can't find the container with id 430ca5cf6996773ff07b4743869d5ba401aeaaaac0523bfac08435f9368ab20b
Apr 21 15:10:37.376830 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.376808 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a989bd5_4d7c_4917_b441_576b61407d76.slice/crio-edc3bcb0f0e1f06ca3c9b247023956c15dd4095bab138412eb83c7f25c1aaa9c WatchSource:0}: Error finding container edc3bcb0f0e1f06ca3c9b247023956c15dd4095bab138412eb83c7f25c1aaa9c: Status 404 returned error can't find the container with id edc3bcb0f0e1f06ca3c9b247023956c15dd4095bab138412eb83c7f25c1aaa9c
Apr 21 15:10:37.377404 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.377377 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572e9b4f_7cd2_40c1_9fe5_538bad3971a7.slice/crio-bbc42e0c25305ed2255dd74312f79689ddf62db050bb36d8bbe56d6bcabaf589 WatchSource:0}: Error finding container bbc42e0c25305ed2255dd74312f79689ddf62db050bb36d8bbe56d6bcabaf589: Status 404 returned error can't find the container with id bbc42e0c25305ed2255dd74312f79689ddf62db050bb36d8bbe56d6bcabaf589
Apr 21 15:10:37.379607 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.379537 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93dbff63_b3ad_4508_8f23_3d4394458b3b.slice/crio-b05643ea95dd0ad853e1ecff1b412a3a3d36d94e22fa9b21a89c21ef6c1ed096 WatchSource:0}: Error finding container b05643ea95dd0ad853e1ecff1b412a3a3d36d94e22fa9b21a89c21ef6c1ed096: Status 404 returned error can't find the container with id b05643ea95dd0ad853e1ecff1b412a3a3d36d94e22fa9b21a89c21ef6c1ed096
Apr 21 15:10:37.381486 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.381463 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006254c_0f60_4649_85ee_dee1f0871d1b.slice/crio-734884329d8e03418466f7fa4d01b46cfac8673e6bbf9e591cbc1dae73049d3e WatchSource:0}: Error finding container 734884329d8e03418466f7fa4d01b46cfac8673e6bbf9e591cbc1dae73049d3e: Status 404 returned error can't find the container with id 734884329d8e03418466f7fa4d01b46cfac8673e6bbf9e591cbc1dae73049d3e
Apr 21 15:10:37.382193 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.382121 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83ccf86_3867_4183_a1d2_d8fb6871e584.slice/crio-cffc1e125bd04cf9a0e8b4468b062cda9e62816f56ccd3f46caea9fbb9bd9b40 WatchSource:0}: Error finding container cffc1e125bd04cf9a0e8b4468b062cda9e62816f56ccd3f46caea9fbb9bd9b40: Status 404 returned error can't find the container with id cffc1e125bd04cf9a0e8b4468b062cda9e62816f56ccd3f46caea9fbb9bd9b40
Apr 21 15:10:37.383901 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:10:37.383872 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb07dbb0_6d9b_4e68_b76f_a4fd4f68fb8c.slice/crio-d62bec8f7ca9436ef14365eac0725583261b597af36c94e330b8413e9db8f640 WatchSource:0}: Error finding container d62bec8f7ca9436ef14365eac0725583261b597af36c94e330b8413e9db8f640: Status 404 returned error can't find the container with id d62bec8f7ca9436ef14365eac0725583261b597af36c94e330b8413e9db8f640
Apr 21 15:10:37.717935 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.717863 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:35 +0000 UTC" deadline="2027-11-09 11:04:59.837965545 +0000 UTC"
Apr 21 15:10:37.717935 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.717891 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13603h54m22.120076518s"
Apr 21 15:10:37.745256 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.745225 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:37.745410 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.745225 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:37.745473 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.745387 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:37.745473 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:37.745450 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:37.756795 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.756522 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" event={"ID":"e86a3d6d8eccc7f6cc8d288787df4758","Type":"ContainerStarted","Data":"a63c57b41ce356ed42bdd4987158062e47e723967ef384d954fe62d5238ff3ce"}
Apr 21 15:10:37.758541 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.758513 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhzt8" event={"ID":"f83ccf86-3867-4183-a1d2-d8fb6871e584","Type":"ContainerStarted","Data":"cffc1e125bd04cf9a0e8b4468b062cda9e62816f56ccd3f46caea9fbb9bd9b40"}
Apr 21 15:10:37.763238 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.763192 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4zr9g" event={"ID":"9006254c-0f60-4649-85ee-dee1f0871d1b","Type":"ContainerStarted","Data":"734884329d8e03418466f7fa4d01b46cfac8673e6bbf9e591cbc1dae73049d3e"}
Apr 21 15:10:37.771407 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.771380 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68mrx" event={"ID":"93dbff63-b3ad-4508-8f23-3d4394458b3b","Type":"ContainerStarted","Data":"b05643ea95dd0ad853e1ecff1b412a3a3d36d94e22fa9b21a89c21ef6c1ed096"}
Apr 21 15:10:37.772144 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.772098 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-120.ec2.internal" podStartSLOduration=2.772083051 podStartE2EDuration="2.772083051s" podCreationTimestamp="2026-04-21 15:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:37.771544185 +0000 UTC m=+3.651316002" watchObservedRunningTime="2026-04-21 15:10:37.772083051 +0000 UTC m=+3.651854865"
Apr 21 15:10:37.773577 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.773534 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerStarted","Data":"edc3bcb0f0e1f06ca3c9b247023956c15dd4095bab138412eb83c7f25c1aaa9c"}
Apr 21 15:10:37.779046 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.778999 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7n5n5" event={"ID":"9b3e45df-578e-4456-b850-310c9d4a72fa","Type":"ContainerStarted","Data":"3f4689e64eabb605f57427fefb88e1b69fe44c0d57cb314487e8d391d71bec12"}
Apr 21 15:10:37.785609 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.785584 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" event={"ID":"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c","Type":"ContainerStarted","Data":"d62bec8f7ca9436ef14365eac0725583261b597af36c94e330b8413e9db8f640"}
Apr 21 15:10:37.796314 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.795613 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g64wd" event={"ID":"572e9b4f-7cd2-40c1-9fe5-538bad3971a7","Type":"ContainerStarted","Data":"bbc42e0c25305ed2255dd74312f79689ddf62db050bb36d8bbe56d6bcabaf589"}
Apr 21 15:10:37.798265 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.798241 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"430ca5cf6996773ff07b4743869d5ba401aeaaaac0523bfac08435f9368ab20b"}
Apr 21 15:10:37.800528 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:37.800505 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j58rf" event={"ID":"ecffb9c8-2d5e-409b-8013-126edd86ac8a","Type":"ContainerStarted","Data":"db0c63dc7469ff9934fa963f048bc7e1b8eae429c9c20a5be33897c268e30dd7"}
Apr 21 15:10:38.275147 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.275108 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:38.275430 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.275172 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:38.275430 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.275322 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:38.275430 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.275380 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:10:40.275363474 +0000 UTC m=+6.155135281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:38.275803 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.275768 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:38.275894 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.275835 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:10:40.27582032 +0000 UTC m=+6.155592126 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:38.376556 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.375987 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:38.376556 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.376138 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:38.376556 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.376157 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:38.376556 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.376168 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:38.376556 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.376223 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:10:40.37620586 +0000 UTC m=+6.255977658 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:38.744935 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.744860 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:38.745355 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:38.745015 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:38.836460 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.836421 2583 generic.go:358] "Generic (PLEG): container finished" podID="7731dc1999905d873408b3b927eba2ce" containerID="bb7e93343c16739c103156faad78b94289f90b93f76cd8108ca23c68f06d3e51" exitCode=0
Apr 21 15:10:38.837389 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:38.837363 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" event={"ID":"7731dc1999905d873408b3b927eba2ce","Type":"ContainerDied","Data":"bb7e93343c16739c103156faad78b94289f90b93f76cd8108ca23c68f06d3e51"}
Apr 21 15:10:39.746033 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:39.745075 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:39.746033 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:39.745430 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:39.746033 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:39.745892 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:39.746033 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:39.745983 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:39.842436 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:39.841844 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" event={"ID":"7731dc1999905d873408b3b927eba2ce","Type":"ContainerStarted","Data":"21051975b170235f62c629576a9e0b6bead9a9173b90c10300b7efaf4c13fe9e"}
Apr 21 15:10:40.290554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:40.290509 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:40.290737 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:40.290564 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:40.290737 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.290713 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:40.290889 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.290790 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:10:44.290755675 +0000 UTC m=+10.170527481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:40.291202 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.291184 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:40.291284 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.291235 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:10:44.291221952 +0000 UTC m=+10.170993745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:40.391823 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:40.391783 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:40.392020 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.391953 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:40.392020 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.391979 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:40.392020 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.392015 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:40.392191 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.392074 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:10:44.39205536 +0000 UTC m=+10.271827167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:40.744639 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:40.744560 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:40.744810 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:40.744722 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:41.744575 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:41.744546 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:41.744575 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:41.744581 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:41.745106 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:41.744668 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:41.745106 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:41.744786 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:42.744764 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:42.744713 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:42.745203 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:42.744892 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:43.745187 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:43.745088 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:43.745643 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:43.745092 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:43.745643 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:43.745288 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:43.745643 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:43.745357 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:44.324789 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:44.324658 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:44.324789 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:44.324751 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:44.324988 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.324877 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:44.324988 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.324901 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:44.324988 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.324957 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:10:52.324936606 +0000 UTC m=+18.204708407 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:44.324988 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.324977 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:10:52.324967128 +0000 UTC m=+18.204738935 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:10:44.425988 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:44.425925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:44.426181 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.426125 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:44.426181 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.426144 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:44.426181 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.426155 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:44.426322 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.426212 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:10:52.426193117 +0000 UTC m=+18.305964921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:44.745308 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:44.745279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:44.745743 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:44.745380 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:45.744485 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:45.744450 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:45.744666 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:45.744450 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:45.744666 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:45.744587 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:45.744666 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:45.744637 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:46.745213 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:46.745138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:46.745610 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:46.745272 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:47.744406 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:47.744376 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:47.744579 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:47.744376 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:47.744579 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:47.744478 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:47.744579 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:47.744533 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:48.744868 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:48.744832 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:48.745294 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:48.744979 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:49.744443 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:49.744409 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:49.744443 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:49.744449 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:49.744640 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:49.744530 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c"
Apr 21 15:10:49.744680 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:49.744652 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba"
Apr 21 15:10:50.744922 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:50.744888 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:10:50.745355 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:50.745029 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad"
Apr 21 15:10:51.744221 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:51.744191 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:10:51.744542 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:51.744191 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:10:51.744542 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:51.744295 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:10:51.744542 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:51.744377 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:10:52.383646 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:52.383611 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:52.384096 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:52.383699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:52.384096 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.383786 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:52.384096 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.383800 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:52.384096 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.383862 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret podName:c7fed088-3fe1-4c99-b3f4-37af1f9f317c nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.383844376 +0000 UTC m=+34.263616184 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret") pod "global-pull-secret-syncer-b8vgq" (UID: "c7fed088-3fe1-4c99-b3f4-37af1f9f317c") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:52.384096 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.383879 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.383870493 +0000 UTC m=+34.263642284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:52.485004 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:52.484958 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:52.485174 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.485146 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:52.485174 ip-10-0-143-120 
kubenswrapper[2583]: E0421 15:10:52.485171 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:52.485277 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.485184 2583 projected.go:194] Error preparing data for projected volume kube-api-access-46m8r for pod openshift-network-diagnostics/network-check-target-bhk2h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:52.485277 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.485247 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r podName:a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.485229274 +0000 UTC m=+34.365001080 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-46m8r" (UniqueName: "kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r") pod "network-check-target-bhk2h" (UID: "a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:52.744899 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:52.744806 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:52.745059 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:52.744979 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:10:53.744580 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:53.744550 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:53.745012 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:53.744559 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:53.745012 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:53.744668 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:10:53.745012 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:53.744761 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:10:54.745458 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.745432 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:54.746299 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:54.745555 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:10:54.869684 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.869449 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhzt8" event={"ID":"f83ccf86-3867-4183-a1d2-d8fb6871e584","Type":"ContainerStarted","Data":"3984fc7c2f746fa6dd341d33a6a29600f4233af213abdd5844f2c363c43c9ec1"} Apr 21 15:10:54.871707 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.871275 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4zr9g" event={"ID":"9006254c-0f60-4649-85ee-dee1f0871d1b","Type":"ContainerStarted","Data":"9ccee771bd187d8531d42941a3355d505fa5814e94a91abf1deb9409fb61a009"} Apr 21 15:10:54.872803 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.872763 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68mrx" event={"ID":"93dbff63-b3ad-4508-8f23-3d4394458b3b","Type":"ContainerStarted","Data":"b0f41bf563852a8d7892f3ac553a9809ac168d920207c8f7fb01a1ec353475fd"} Apr 21 15:10:54.874220 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.874056 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerStarted","Data":"3d12c2915d6fc19a61be631ae426eaf3211092ce1619f45d9facdcdc1d178ad8"} Apr 21 15:10:54.875440 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.875419 
2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" event={"ID":"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c","Type":"ContainerStarted","Data":"089a7cebe17723f39b8fe1edb3033a81bffb9a355645236e28a06a05a06f93b7"} Apr 21 15:10:54.876854 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.876833 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g64wd" event={"ID":"572e9b4f-7cd2-40c1-9fe5-538bad3971a7","Type":"ContainerStarted","Data":"86c8ddad07e28305c43fd69f8dd0d1c223f8b5058a766f7921cb6f1059e217d8"} Apr 21 15:10:54.878451 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.878428 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"46b8a92901c4b9612c5049ea60e753bb704e2b5d719c6d7bed050f54b232e4e2"} Apr 21 15:10:54.878541 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.878458 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"803ab995035108b0676f5bf22f4c6147abf0aaf4691ef6112f5d9bdd14ad09ab"} Apr 21 15:10:54.879593 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.879573 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j58rf" event={"ID":"ecffb9c8-2d5e-409b-8013-126edd86ac8a","Type":"ContainerStarted","Data":"9e501a30e5ee543058e23adc0e6846f5fa9913babe11a8da40564d695e877a9f"} Apr 21 15:10:54.897524 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.897482 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-120.ec2.internal" podStartSLOduration=19.89746927 podStartE2EDuration="19.89746927s" podCreationTimestamp="2026-04-21 15:10:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:39.863167461 +0000 UTC m=+5.742939297" watchObservedRunningTime="2026-04-21 15:10:54.89746927 +0000 UTC m=+20.777241084" Apr 21 15:10:54.897926 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.897896 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bhzt8" podStartSLOduration=3.7534192600000003 podStartE2EDuration="20.897888787s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.385066558 +0000 UTC m=+3.264838360" lastFinishedPulling="2026-04-21 15:10:54.529536096 +0000 UTC m=+20.409307887" observedRunningTime="2026-04-21 15:10:54.897541764 +0000 UTC m=+20.777313579" watchObservedRunningTime="2026-04-21 15:10:54.897888787 +0000 UTC m=+20.777660601" Apr 21 15:10:54.920826 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.920789 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4zr9g" podStartSLOduration=11.947386452 podStartE2EDuration="20.920760778s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.384468857 +0000 UTC m=+3.264240649" lastFinishedPulling="2026-04-21 15:10:46.35784317 +0000 UTC m=+12.237614975" observedRunningTime="2026-04-21 15:10:54.920480927 +0000 UTC m=+20.800252740" watchObservedRunningTime="2026-04-21 15:10:54.920760778 +0000 UTC m=+20.800532590" Apr 21 15:10:54.946741 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:54.946702 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g64wd" podStartSLOduration=3.796511936 podStartE2EDuration="20.946691996s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.379425118 +0000 UTC m=+3.259196925" lastFinishedPulling="2026-04-21 15:10:54.529605179 +0000 UTC 
m=+20.409376985" observedRunningTime="2026-04-21 15:10:54.946389421 +0000 UTC m=+20.826161234" watchObservedRunningTime="2026-04-21 15:10:54.946691996 +0000 UTC m=+20.826463808" Apr 21 15:10:55.025827 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.025765 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-68mrx" podStartSLOduration=3.876763307 podStartE2EDuration="21.025751186s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.382089332 +0000 UTC m=+3.261861128" lastFinishedPulling="2026-04-21 15:10:54.531077213 +0000 UTC m=+20.410849007" observedRunningTime="2026-04-21 15:10:55.025474895 +0000 UTC m=+20.905246707" watchObservedRunningTime="2026-04-21 15:10:55.025751186 +0000 UTC m=+20.905522999" Apr 21 15:10:55.026312 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.026290 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j58rf" podStartSLOduration=3.868246971 podStartE2EDuration="21.026283244s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.375361808 +0000 UTC m=+3.255133604" lastFinishedPulling="2026-04-21 15:10:54.533398082 +0000 UTC m=+20.413169877" observedRunningTime="2026-04-21 15:10:55.004162937 +0000 UTC m=+20.883934750" watchObservedRunningTime="2026-04-21 15:10:55.026283244 +0000 UTC m=+20.906055112" Apr 21 15:10:55.728346 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.728326 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:10:55.745048 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.745021 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:55.745129 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.745021 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:55.745163 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:55.745144 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:10:55.745208 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:55.745184 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:10:55.882466 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.882397 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="3d12c2915d6fc19a61be631ae426eaf3211092ce1619f45d9facdcdc1d178ad8" exitCode=0 Apr 21 15:10:55.882466 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.882461 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"3d12c2915d6fc19a61be631ae426eaf3211092ce1619f45d9facdcdc1d178ad8"} Apr 21 15:10:55.884066 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.884045 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" event={"ID":"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c","Type":"ContainerStarted","Data":"a51cdd6c89a3d93e5e8aec79ded6af58ac62f7ad8c825e108661435174dbffaa"} Apr 21 15:10:55.888079 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.888055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"075722b6ac0144e943beccadd7d496e724d68208fe5db0673971f4c20e5b0658"} Apr 21 15:10:55.888169 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.888089 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"d4101c7ce243c05c51a01093359c07c09768af8c15b92b15fa9528be02371659"} Apr 21 15:10:55.888169 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.888104 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" 
event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"c437004a81dfdb6666f9a7da8bacdf2bdedc3f00624a1eee968ae48885333c80"} Apr 21 15:10:55.888169 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:55.888118 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"a85f61b9937bc715d98c1fdc58b69bce58ead9e541a0d723d5d415d7872fdb65"} Apr 21 15:10:56.708579 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.708435 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:10:55.728343594Z","UUID":"c608dc75-d61d-4719-9a71-a048286201e8","Handler":null,"Name":"","Endpoint":""} Apr 21 15:10:56.710594 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.710563 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:10:56.710594 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.710593 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:10:56.744339 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.744309 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:56.744465 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:56.744429 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:10:56.890903 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.890865 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7n5n5" event={"ID":"9b3e45df-578e-4456-b850-310c9d4a72fa","Type":"ContainerStarted","Data":"3654839bafcbf0d2e82d64daa4ab2ff493789c986a0ce425a25c0584cc1b99de"} Apr 21 15:10:56.893086 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.893055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" event={"ID":"db07dbb0-6d9b-4e68-b76f-a4fd4f68fb8c","Type":"ContainerStarted","Data":"3a0e1d9a698cb9941fc1513716a03d34ae628805b3f4bc9e7fbc2468784c5118"} Apr 21 15:10:56.908797 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.908745 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7n5n5" podStartSLOduration=5.753147896 podStartE2EDuration="22.908725731s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.373283534 +0000 UTC m=+3.253055326" lastFinishedPulling="2026-04-21 15:10:54.528861362 +0000 UTC m=+20.408633161" observedRunningTime="2026-04-21 15:10:56.908625657 +0000 UTC m=+22.788397481" watchObservedRunningTime="2026-04-21 15:10:56.908725731 +0000 UTC m=+22.788497543" Apr 21 15:10:56.934436 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:56.934397 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r69ph" podStartSLOduration=3.914707626 podStartE2EDuration="22.934381363s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.387358821 +0000 UTC m=+3.267130613" lastFinishedPulling="2026-04-21 15:10:56.407032558 +0000 UTC m=+22.286804350" observedRunningTime="2026-04-21 15:10:56.933699015 +0000 
UTC m=+22.813470828" watchObservedRunningTime="2026-04-21 15:10:56.934381363 +0000 UTC m=+22.814153178" Apr 21 15:10:57.744982 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:57.744904 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:57.745146 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:57.744905 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:57.745146 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:57.745026 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:10:57.745146 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:57.745089 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:10:57.898251 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:57.898218 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"8c11c4c087138fc31573324461f6cee1f637457f2131dc6d945a613c42cda5ee"} Apr 21 15:10:58.744934 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:58.744904 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:10:58.745118 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:58.745033 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:10:59.744847 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:59.744762 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:10:59.745293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:59.744762 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:10:59.745293 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:59.744887 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:10:59.745293 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:10:59.745002 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:10:59.783508 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:59.783472 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:10:59.784290 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:10:59.784270 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:11:00.744451 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.744271 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:00.744594 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:00.744525 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:11:00.907303 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.907270 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" event={"ID":"e46ae269-acf9-41f8-bfa2-1d7fd1b27c47","Type":"ContainerStarted","Data":"8a8603fab172665410627a37f23fd97dadd9bae061bfc2972e444f591245df05"} Apr 21 15:11:00.907818 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.907519 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:11:00.907818 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.907543 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:11:00.908970 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.908948 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="e45723af7e8a909f2efdbdc806fb6dc42f5d51d1ef7392507353ac42059f9311" exitCode=0 Apr 21 15:11:00.909064 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.908980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"e45723af7e8a909f2efdbdc806fb6dc42f5d51d1ef7392507353ac42059f9311"} Apr 21 15:11:00.922132 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.922106 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:11:00.956968 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:00.956928 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" podStartSLOduration=9.703727605 podStartE2EDuration="26.956916101s" podCreationTimestamp="2026-04-21 15:10:34 
+0000 UTC" firstStartedPulling="2026-04-21 15:10:37.376392717 +0000 UTC m=+3.256164508" lastFinishedPulling="2026-04-21 15:10:54.629581207 +0000 UTC m=+20.509353004" observedRunningTime="2026-04-21 15:11:00.955292834 +0000 UTC m=+26.835064641" watchObservedRunningTime="2026-04-21 15:11:00.956916101 +0000 UTC m=+26.836687915" Apr 21 15:11:01.744853 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.744667 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:01.744936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.744667 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:01.744982 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:01.744931 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:11:01.745014 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:01.744999 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:11:01.868137 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.868063 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bhk2h"] Apr 21 15:11:01.871302 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.871279 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wpmpv"] Apr 21 15:11:01.871429 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.871389 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:01.871505 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:01.871483 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:11:01.871818 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.871796 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b8vgq"] Apr 21 15:11:01.913630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.913605 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="f142f5bc344cd44259e2cf5a2a6937cf60bf7424fc14e90ee991807a62a136a6" exitCode=0 Apr 21 15:11:01.914016 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.913702 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"f142f5bc344cd44259e2cf5a2a6937cf60bf7424fc14e90ee991807a62a136a6"} Apr 21 15:11:01.914016 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.913742 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:01.914016 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:01.913878 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:11:01.914734 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.914199 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:01.914734 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:01.914388 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:11:01.914734 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.914425 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:11:01.927883 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:01.927863 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7" Apr 21 15:11:02.850028 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:02.849946 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:11:02.850162 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:02.850096 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 15:11:02.850621 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:02.850594 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4zr9g" Apr 21 15:11:02.917352 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:02.917323 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="a027c727ea647ca36a3777e056bf2b42174d3273d466f72bf6e504ecfb49a386" exitCode=0 Apr 21 15:11:02.917880 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:02.917394 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"a027c727ea647ca36a3777e056bf2b42174d3273d466f72bf6e504ecfb49a386"} Apr 21 15:11:03.744556 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:03.744491 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:03.744689 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:03.744491 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:03.744689 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:03.744613 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:11:03.744829 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:03.744708 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:11:03.744829 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:03.744492 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:03.744917 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:03.744835 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:11:05.744218 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:05.744185 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:05.744977 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:05.744306 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:05.744977 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:05.744313 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:11:05.744977 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:05.744334 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:05.744977 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:05.744391 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhk2h" podUID="a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba" Apr 21 15:11:05.744977 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:05.744459 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b8vgq" podUID="c7fed088-3fe1-4c99-b3f4-37af1f9f317c" Apr 21 15:11:07.446588 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.446514 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-120.ec2.internal" event="NodeReady" Apr 21 15:11:07.447035 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.446642 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:11:07.507313 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.507278 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pdvb8"] Apr 21 15:11:07.535932 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.535901 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rczsl"] Apr 21 15:11:07.536112 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.536093 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.538473 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.538452 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\"" Apr 21 15:11:07.538589 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.538518 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:11:07.538736 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.538717 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:11:07.551509 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.551489 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pdvb8"] Apr 21 15:11:07.551618 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.551518 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rczsl"] Apr 21 15:11:07.551672 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.551626 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:07.555630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.555613 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:11:07.556253 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.556216 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:11:07.556485 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.556461 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:11:07.557937 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.557839 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\"" Apr 21 15:11:07.709417 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709375 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.709417 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709416 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6k7\" (UniqueName: \"kubernetes.io/projected/db283e03-ba54-420d-b4b8-709634355db8-kube-api-access-ql6k7\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:07.709645 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709459 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6626260d-dd76-4a21-87fd-b126ca3a6aac-tmp-dir\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.709645 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709528 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pfvs\" (UniqueName: \"kubernetes.io/projected/6626260d-dd76-4a21-87fd-b126ca3a6aac-kube-api-access-7pfvs\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.709645 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:07.709803 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.709666 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6626260d-dd76-4a21-87fd-b126ca3a6aac-config-volume\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.744802 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.744751 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:07.744973 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.744808 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:07.744973 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.744900 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:07.747493 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747442 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:11:07.747630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747518 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:11:07.747630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747541 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\"" Apr 21 15:11:07.747741 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747657 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:11:07.747741 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747701 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4fs5d\"" Apr 21 15:11:07.747873 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.747813 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:11:07.811076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811039 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811084 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6k7\" (UniqueName: 
\"kubernetes.io/projected/db283e03-ba54-420d-b4b8-709634355db8-kube-api-access-ql6k7\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811115 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6626260d-dd76-4a21-87fd-b126ca3a6aac-tmp-dir\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pfvs\" (UniqueName: \"kubernetes.io/projected/6626260d-dd76-4a21-87fd-b126ca3a6aac-kube-api-access-7pfvs\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811169 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811210 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6626260d-dd76-4a21-87fd-b126ca3a6aac-config-volume\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.811243 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:07.811224 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:07.811559 
ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:07.811312 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.31129468 +0000 UTC m=+34.191066474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found Apr 21 15:11:07.811559 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:07.811413 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:07.811559 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:07.811477 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.311459147 +0000 UTC m=+34.191230962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found Apr 21 15:11:07.811559 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811512 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6626260d-dd76-4a21-87fd-b126ca3a6aac-tmp-dir\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.811857 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.811830 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6626260d-dd76-4a21-87fd-b126ca3a6aac-config-volume\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.823336 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.823165 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pfvs\" (UniqueName: \"kubernetes.io/projected/6626260d-dd76-4a21-87fd-b126ca3a6aac-kube-api-access-7pfvs\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:07.823469 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:07.823272 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6k7\" (UniqueName: \"kubernetes.io/projected/db283e03-ba54-420d-b4b8-709634355db8-kube-api-access-ql6k7\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:08.314350 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.314317 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:08.314550 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.314362 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:08.314550 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.314461 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:08.314550 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.314516 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:09.314502995 +0000 UTC m=+35.194274799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found Apr 21 15:11:08.314550 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.314461 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:08.314689 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.314602 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:11:09.314588717 +0000 UTC m=+35.194360513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found Apr 21 15:11:08.414868 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.414839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:08.415007 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.414875 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:08.415007 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.414963 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:11:08.415085 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:08.415008 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:11:40.414996428 +0000 UTC m=+66.294768218 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : secret "metrics-daemon-secret" not found Apr 21 15:11:08.417053 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.417024 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7fed088-3fe1-4c99-b3f4-37af1f9f317c-original-pull-secret\") pod \"global-pull-secret-syncer-b8vgq\" (UID: \"c7fed088-3fe1-4c99-b3f4-37af1f9f317c\") " pod="kube-system/global-pull-secret-syncer-b8vgq" Apr 21 15:11:08.515635 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.515602 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:08.518361 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.518341 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46m8r\" (UniqueName: \"kubernetes.io/projected/a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba-kube-api-access-46m8r\") pod \"network-check-target-bhk2h\" (UID: \"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba\") " pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:11:08.604999 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.604942 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"] Apr 21 15:11:08.627175 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.627154 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"] 
Apr 21 15:11:08.627175 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.627179 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"]
Apr 21 15:11:08.627324 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.627295 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.630893 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.630870 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 15:11:08.630999 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.630943 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 15:11:08.630999 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.630971 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 15:11:08.631620 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.631600 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 15:11:08.640548 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.640532 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.642952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.642931 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"]
Apr 21 15:11:08.643232 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.643215 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 21 15:11:08.643348 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.643217 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 21 15:11:08.643444 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.643425 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 21 15:11:08.643481 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.643465 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 21 15:11:08.657352 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.657332 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b8vgq"
Apr 21 15:11:08.664024 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.664010 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:11:08.817954 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.817859 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.817954 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.817942 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.818230 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.817976 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.818230 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818001 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1f1992b-382d-499c-8974-94ab1d508626-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.818230 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818084 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d07b9de0-0135-4f26-8195-12adbe70dc9a-tmp\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.818230 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818162 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d07b9de0-0135-4f26-8195-12adbe70dc9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.818230 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818203 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkf8\" (UniqueName: \"kubernetes.io/projected/d07b9de0-0135-4f26-8195-12adbe70dc9a-kube-api-access-vzkf8\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.818475 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818253 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.818475 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.818280 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6lq\" (UniqueName: \"kubernetes.io/projected/f1f1992b-382d-499c-8974-94ab1d508626-kube-api-access-zb6lq\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.872318 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.872292 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b8vgq"]
Apr 21 15:11:08.875247 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.875224 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bhk2h"]
Apr 21 15:11:08.894138 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:11:08.894113 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7fed088_3fe1_4c99_b3f4_37af1f9f317c.slice/crio-1e5dad7f31fb9bee55dfc918686ca84c2432adddea2b495c5ebcf10d7e786a18 WatchSource:0}: Error finding container 1e5dad7f31fb9bee55dfc918686ca84c2432adddea2b495c5ebcf10d7e786a18: Status 404 returned error can't find the container with id 1e5dad7f31fb9bee55dfc918686ca84c2432adddea2b495c5ebcf10d7e786a18
Apr 21 15:11:08.894872 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:11:08.894849 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c8ffdc_c1d9_4058_af4c_7ed9bd46d1ba.slice/crio-1354dac296f7c70a72d22a3a6b1d85cad61f87f6c01728ffaa0f012029096136 WatchSource:0}: Error finding container 1354dac296f7c70a72d22a3a6b1d85cad61f87f6c01728ffaa0f012029096136: Status 404 returned error can't find the container with id 1354dac296f7c70a72d22a3a6b1d85cad61f87f6c01728ffaa0f012029096136
Apr 21 15:11:08.919293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919271 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.919403 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919300 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.919403 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919318 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1f1992b-382d-499c-8974-94ab1d508626-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.919403 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919342 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d07b9de0-0135-4f26-8195-12adbe70dc9a-tmp\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.919403 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919360 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d07b9de0-0135-4f26-8195-12adbe70dc9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.919763 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919728 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d07b9de0-0135-4f26-8195-12adbe70dc9a-tmp\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.919864 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919823 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkf8\" (UniqueName: \"kubernetes.io/projected/d07b9de0-0135-4f26-8195-12adbe70dc9a-kube-api-access-vzkf8\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.919923 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919868 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.919923 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919896 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6lq\" (UniqueName: \"kubernetes.io/projected/f1f1992b-382d-499c-8974-94ab1d508626-kube-api-access-zb6lq\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.920020 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.919979 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.920711 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.920065 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1f1992b-382d-499c-8974-94ab1d508626-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.923476 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.923452 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d07b9de0-0135-4f26-8195-12adbe70dc9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.923557 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.923483 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-ca\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.923557 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.923497 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.923557 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.923526 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.923901 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.923884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1f1992b-382d-499c-8974-94ab1d508626-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.928093 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.928044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkf8\" (UniqueName: \"kubernetes.io/projected/d07b9de0-0135-4f26-8195-12adbe70dc9a-kube-api-access-vzkf8\") pod \"klusterlet-addon-workmgr-5d447f6b5f-8x82q\" (UID: \"d07b9de0-0135-4f26-8195-12adbe70dc9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.928269 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.928245 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6lq\" (UniqueName: \"kubernetes.io/projected/f1f1992b-382d-499c-8974-94ab1d508626-kube-api-access-zb6lq\") pod \"cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd\" (UID: \"f1f1992b-382d-499c-8974-94ab1d508626\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:08.931917 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.931890 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerStarted","Data":"92c876602d478c0d1c940dcc835ad75fea3137ca4e47f359eaf3752afe115964"}
Apr 21 15:11:08.933267 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.933229 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bhk2h" event={"ID":"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba","Type":"ContainerStarted","Data":"1354dac296f7c70a72d22a3a6b1d85cad61f87f6c01728ffaa0f012029096136"}
Apr 21 15:11:08.934373 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.934352 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b8vgq" event={"ID":"c7fed088-3fe1-4c99-b3f4-37af1f9f317c","Type":"ContainerStarted","Data":"1e5dad7f31fb9bee55dfc918686ca84c2432adddea2b495c5ebcf10d7e786a18"}
Apr 21 15:11:08.936575 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.936551 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:08.956605 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:08.956582 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"
Apr 21 15:11:09.067580 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.067499 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"]
Apr 21 15:11:09.087479 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.087452 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd"]
Apr 21 15:11:09.104985 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:11:09.104954 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07b9de0_0135_4f26_8195_12adbe70dc9a.slice/crio-1d642d32b07737531d2f0101ad19001985678e477d041f30472fe985ab387949 WatchSource:0}: Error finding container 1d642d32b07737531d2f0101ad19001985678e477d041f30472fe985ab387949: Status 404 returned error can't find the container with id 1d642d32b07737531d2f0101ad19001985678e477d041f30472fe985ab387949
Apr 21 15:11:09.105236 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:11:09.105211 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f1992b_382d_499c_8974_94ab1d508626.slice/crio-dd51f1d267f17cfa9331b017c0e1a58bcec6a14cfddf0543028cb82c823fe5fb WatchSource:0}: Error finding container dd51f1d267f17cfa9331b017c0e1a58bcec6a14cfddf0543028cb82c823fe5fb: Status 404 returned error can't find the container with id dd51f1d267f17cfa9331b017c0e1a58bcec6a14cfddf0543028cb82c823fe5fb
Apr 21 15:11:09.324911 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.324879 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:11:09.325084 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.324918 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:11:09.325084 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:09.325015 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:09.325084 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:09.325063 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:11.325050851 +0000 UTC m=+37.204822642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found
Apr 21 15:11:09.325202 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:09.325013 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:09.325202 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:09.325155 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:11.325139984 +0000 UTC m=+37.204911776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:09.939860 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.939675 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="92c876602d478c0d1c940dcc835ad75fea3137ca4e47f359eaf3752afe115964" exitCode=0
Apr 21 15:11:09.939860 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.939764 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"92c876602d478c0d1c940dcc835ad75fea3137ca4e47f359eaf3752afe115964"}
Apr 21 15:11:09.941507 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.941482 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerStarted","Data":"dd51f1d267f17cfa9331b017c0e1a58bcec6a14cfddf0543028cb82c823fe5fb"}
Apr 21 15:11:09.944008 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:09.943963 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" event={"ID":"d07b9de0-0135-4f26-8195-12adbe70dc9a","Type":"ContainerStarted","Data":"1d642d32b07737531d2f0101ad19001985678e477d041f30472fe985ab387949"}
Apr 21 15:11:10.954935 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:10.954803 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a989bd5-4d7c-4917-b441-576b61407d76" containerID="edf31ebe041146a606d5eafbb17daa7b8568f5f7102e40412b244677ec73cc18" exitCode=0
Apr 21 15:11:10.954935 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:10.954886 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerDied","Data":"edf31ebe041146a606d5eafbb17daa7b8568f5f7102e40412b244677ec73cc18"}
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:11.342888 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:11.342941 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:11.343055 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:11.343111 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:15.343093909 +0000 UTC m=+41.222865707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:11.343491 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:11.343606 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:11.343534 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:15.343520418 +0000 UTC m=+41.223292214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:11.961749 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:11.961702 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" event={"ID":"2a989bd5-4d7c-4917-b441-576b61407d76","Type":"ContainerStarted","Data":"5bf07332954c455f6ca00615f2c9808fd89a7ecc9e0850e93fd5e7872f88bb4d"}
Apr 21 15:11:14.777582 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:14.777525 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g2z5r" podStartSLOduration=9.422031271 podStartE2EDuration="40.777507389s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.378938889 +0000 UTC m=+3.258710682" lastFinishedPulling="2026-04-21 15:11:08.734415008 +0000 UTC m=+34.614186800" observedRunningTime="2026-04-21 15:11:11.993900753 +0000 UTC m=+37.873672588" watchObservedRunningTime="2026-04-21 15:11:14.777507389 +0000 UTC m=+40.657279197"
Apr 21 15:11:15.372786 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:15.372741 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:11:15.372997 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:15.372813 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:11:15.372997 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:15.372895 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:15.372997 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:15.372965 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:23.372944918 +0000 UTC m=+49.252716714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:15.372997 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:15.372895 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:15.373210 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:15.373065 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:23.373041725 +0000 UTC m=+49.252813525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found
Apr 21 15:11:18.976496 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.976460 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerStarted","Data":"698267f4e5362591944680a6e4e34e094f8512bb0ea0d03e1150e83a39d4f702"}
Apr 21 15:11:18.977650 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.977621 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b8vgq" event={"ID":"c7fed088-3fe1-4c99-b3f4-37af1f9f317c","Type":"ContainerStarted","Data":"9afb15704513ffb0728a795247f3e5af852ede39299bf13490c43088ad805c29"}
Apr 21 15:11:18.978863 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.978839 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" event={"ID":"d07b9de0-0135-4f26-8195-12adbe70dc9a","Type":"ContainerStarted","Data":"d344c8c88ac03bcd11e625d26b270861c0799fb18e3fb765da3432d09a902d4b"}
Apr 21 15:11:18.979023 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.979009 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:18.980127 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.980093 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bhk2h" event={"ID":"a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba","Type":"ContainerStarted","Data":"5ba1b973df493d9d3bd6cb9c6e6f740193a3e790ead41c414b1afc5851465930"}
Apr 21 15:11:18.980255 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.980237 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bhk2h"
Apr 21 15:11:18.980741 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.980726 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:11:18.993298 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:18.993260 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-b8vgq" podStartSLOduration=33.894688982 podStartE2EDuration="42.993250158s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:11:08.896008059 +0000 UTC m=+34.775779850" lastFinishedPulling="2026-04-21 15:11:17.994569235 +0000 UTC m=+43.874341026" observedRunningTime="2026-04-21 15:11:18.992555301 +0000 UTC m=+44.872327115" watchObservedRunningTime="2026-04-21 15:11:18.993250158 +0000 UTC m=+44.873021970"
Apr 21 15:11:19.008091 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:19.008051 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" podStartSLOduration=2.105660767 podStartE2EDuration="11.008037439s" podCreationTimestamp="2026-04-21 15:11:08 +0000 UTC" firstStartedPulling="2026-04-21 15:11:09.106533826 +0000 UTC m=+34.986305618" lastFinishedPulling="2026-04-21 15:11:18.008910499 +0000 UTC m=+43.888682290" observedRunningTime="2026-04-21 15:11:19.007568283 +0000 UTC m=+44.887340096" watchObservedRunningTime="2026-04-21 15:11:19.008037439 +0000 UTC m=+44.887809252"
Apr 21 15:11:19.025416 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:19.025376 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bhk2h" podStartSLOduration=35.927893815 podStartE2EDuration="45.025366318s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:11:08.897024554 +0000 UTC m=+34.776796352" lastFinishedPulling="2026-04-21 15:11:17.994497051 +0000 UTC m=+43.874268855" observedRunningTime="2026-04-21 15:11:19.024490556 +0000 UTC m=+44.904262369" watchObservedRunningTime="2026-04-21 15:11:19.025366318 +0000 UTC m=+44.905138130"
Apr 21 15:11:23.425214 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:23.425175 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:11:23.425214 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:23.425220 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:11:23.425710 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:23.425322 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:23.425710 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:23.425326 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:23.425710 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:23.425372 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:39.425359498 +0000 UTC m=+65.305131292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found
Apr 21 15:11:23.425710 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:23.425385 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:11:39.425379316 +0000 UTC m=+65.305151107 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:26.997338 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:26.997311 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerStarted","Data":"92d913c611b7ad7f8a2e63b21fee556a90977a932224672bb2303df821c45544"}
Apr 21 15:11:28.001751 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:28.001711 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerStarted","Data":"421d98eaf8e8495cd00c532e740eb82a5586e583070e4e81801675c0022f07da"}
Apr 21 15:11:28.020265 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:28.020144 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" podStartSLOduration=2.226588213 podStartE2EDuration="20.020129273s" podCreationTimestamp="2026-04-21 15:11:08 +0000 UTC" firstStartedPulling="2026-04-21 15:11:09.106608523 +0000 UTC m=+34.986380330" lastFinishedPulling="2026-04-21 15:11:26.900149595 +0000 UTC m=+52.779921390" observedRunningTime="2026-04-21 15:11:28.019438302 +0000 UTC m=+53.899210115" watchObservedRunningTime="2026-04-21 15:11:28.020129273 +0000 UTC m=+53.899901089"
Apr 21 15:11:33.931108 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:33.931072 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf2p7"
Apr 21 15:11:39.426826 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:39.426767 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:11:39.426826 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:39.426830 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:11:39.427303 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:39.426920 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:39.427303 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:39.426923 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:39.427303 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:39.426969 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:11.42695593 +0000 UTC m=+97.306727721 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found Apr 21 15:11:39.427303 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:39.426983 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. 
No retries permitted until 2026-04-21 15:12:11.426975946 +0000 UTC m=+97.306747738 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found Apr 21 15:11:40.434001 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:40.433968 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:11:40.434374 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:40.434114 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:11:40.434374 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:11:40.434181 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. No retries permitted until 2026-04-21 15:12:44.434166112 +0000 UTC m=+130.313937903 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : secret "metrics-daemon-secret" not found Apr 21 15:11:49.985352 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:11:49.985322 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bhk2h" Apr 21 15:12:11.441011 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:12:11.440977 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8" Apr 21 15:12:11.441378 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:12:11.441027 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl" Apr 21 15:12:11.441378 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:11.441133 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:12:11.441378 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:11.441195 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert podName:db283e03-ba54-420d-b4b8-709634355db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:15.441181482 +0000 UTC m=+161.320953276 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert") pod "ingress-canary-rczsl" (UID: "db283e03-ba54-420d-b4b8-709634355db8") : secret "canary-serving-cert" not found Apr 21 15:12:11.441378 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:11.441133 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:12:11.441378 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:11.441265 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls podName:6626260d-dd76-4a21-87fd-b126ca3a6aac nodeName:}" failed. No retries permitted until 2026-04-21 15:13:15.441253159 +0000 UTC m=+161.321024950 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls") pod "dns-default-pdvb8" (UID: "6626260d-dd76-4a21-87fd-b126ca3a6aac") : secret "dns-default-metrics-tls" not found Apr 21 15:12:44.458246 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:12:44.458207 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:12:44.458920 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:44.458388 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:12:44.458920 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:12:44.458487 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs podName:3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad nodeName:}" failed. 
No retries permitted until 2026-04-21 15:14:46.458464814 +0000 UTC m=+252.338236610 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs") pod "network-metrics-daemon-wpmpv" (UID: "3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad") : secret "metrics-daemon-secret" not found Apr 21 15:12:47.876519 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:12:47.876490 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g64wd_572e9b4f-7cd2-40c1-9fe5-538bad3971a7/dns-node-resolver/0.log" Apr 21 15:12:49.288926 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:12:49.288902 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bhzt8_f83ccf86-3867-4183-a1d2-d8fb6871e584/node-ca/0.log" Apr 21 15:13:10.548559 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:13:10.548507 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pdvb8" podUID="6626260d-dd76-4a21-87fd-b126ca3a6aac" Apr 21 15:13:10.561413 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:13:10.561383 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rczsl" podUID="db283e03-ba54-420d-b4b8-709634355db8" Apr 21 15:13:10.769726 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:13:10.769662 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wpmpv" podUID="3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad" Apr 21 15:13:11.233790 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:11.233749 2583 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pdvb8" Apr 21 15:13:12.852236 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.852204 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8jpdv"] Apr 21 15:13:12.855338 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.855317 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:12.858874 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.858843 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:13:12.859521 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.859506 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:13:12.860095 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.860080 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5pq58\"" Apr 21 15:13:12.860158 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.860144 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:13:12.864765 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.864751 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:13:12.874170 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.874144 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8jpdv"] Apr 21 15:13:12.969924 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.969892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zrh6b\" (UniqueName: \"kubernetes.io/projected/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-api-access-zrh6b\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:12.970085 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.969952 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:12.970085 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.969990 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c18cbd9b-856a-4074-9e0f-171624debb4f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:12.970085 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.970015 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c18cbd9b-856a-4074-9e0f-171624debb4f-data-volume\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:12.970085 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:12.970039 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c18cbd9b-856a-4074-9e0f-171624debb4f-crio-socket\") pod \"insights-runtime-extractor-8jpdv\" (UID: 
\"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.023038 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.023004 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7968f9c975-jjfvf"] Apr 21 15:13:13.025729 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.025712 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.030529 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.030510 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gcqst\"" Apr 21 15:13:13.030883 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.030848 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:13:13.030883 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.030867 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:13:13.030883 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.030872 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:13:13.037449 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.037429 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:13:13.044924 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.044906 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7968f9c975-jjfvf"] Apr 21 15:13:13.070512 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070494 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/c18cbd9b-856a-4074-9e0f-171624debb4f-crio-socket\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070542 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrh6b\" (UniqueName: \"kubernetes.io/projected/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-api-access-zrh6b\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070587 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070682 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070613 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c18cbd9b-856a-4074-9e0f-171624debb4f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070682 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070614 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c18cbd9b-856a-4074-9e0f-171624debb4f-crio-socket\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070682 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070642 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c18cbd9b-856a-4074-9e0f-171624debb4f-data-volume\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.070944 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.070931 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c18cbd9b-856a-4074-9e0f-171624debb4f-data-volume\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.071204 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.071186 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.073281 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.073265 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c18cbd9b-856a-4074-9e0f-171624debb4f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.094974 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.094956 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrh6b\" (UniqueName: \"kubernetes.io/projected/c18cbd9b-856a-4074-9e0f-171624debb4f-kube-api-access-zrh6b\") pod 
\"insights-runtime-extractor-8jpdv\" (UID: \"c18cbd9b-856a-4074-9e0f-171624debb4f\") " pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.165005 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.164938 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8jpdv" Apr 21 15:13:13.171734 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.171712 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-tls\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.171839 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.171752 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-trusted-ca\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.171893 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.171839 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrjl\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-kube-api-access-jmrjl\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.171945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.171924 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-image-registry-private-configuration\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.172026 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.172009 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-bound-sa-token\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.172073 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.172039 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53446b04-0fba-43df-ac3d-640ef2ec654d-ca-trust-extracted\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.172073 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.172058 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-certificates\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.172146 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.172075 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-installation-pull-secrets\") pod \"image-registry-7968f9c975-jjfvf\" (UID: 
\"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.273067 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273037 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-image-registry-private-configuration\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.273219 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273093 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-bound-sa-token\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.273266 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273222 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53446b04-0fba-43df-ac3d-640ef2ec654d-ca-trust-extracted\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.273317 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-certificates\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:13.273368 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273322 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-installation-pull-secrets\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.273435 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-tls\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.273490 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273462 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-trusted-ca\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.273539 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrjl\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-kube-api-access-jmrjl\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.273755 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.273685 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53446b04-0fba-43df-ac3d-640ef2ec654d-ca-trust-extracted\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.274125 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.274104 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-certificates\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.274832 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.274813 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53446b04-0fba-43df-ac3d-640ef2ec654d-trusted-ca\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.275523 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.275501 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-image-registry-private-configuration\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.276015 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.275994 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-registry-tls\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.276579 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.276558 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53446b04-0fba-43df-ac3d-640ef2ec654d-installation-pull-secrets\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.277647 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.277628 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8jpdv"]
Apr 21 15:13:13.280452 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:13.280425 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18cbd9b_856a_4074_9e0f_171624debb4f.slice/crio-079a2f6d552d4de27ffc2a0aebb34f36c9788bcc705b88d1e2b5baba1da5d683 WatchSource:0}: Error finding container 079a2f6d552d4de27ffc2a0aebb34f36c9788bcc705b88d1e2b5baba1da5d683: Status 404 returned error can't find the container with id 079a2f6d552d4de27ffc2a0aebb34f36c9788bcc705b88d1e2b5baba1da5d683
Apr 21 15:13:13.282946 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.282925 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrjl\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-kube-api-access-jmrjl\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.283017 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.282942 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53446b04-0fba-43df-ac3d-640ef2ec654d-bound-sa-token\") pod \"image-registry-7968f9c975-jjfvf\" (UID: \"53446b04-0fba-43df-ac3d-640ef2ec654d\") " pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.334057 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.334035 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:13.453076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:13.453045 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7968f9c975-jjfvf"]
Apr 21 15:13:13.456510 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:13.456487 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53446b04_0fba_43df_ac3d_640ef2ec654d.slice/crio-ca2853616dd4b462ee791a98d63a60542c9406474220ef3dec06210d4175f689 WatchSource:0}: Error finding container ca2853616dd4b462ee791a98d63a60542c9406474220ef3dec06210d4175f689: Status 404 returned error can't find the container with id ca2853616dd4b462ee791a98d63a60542c9406474220ef3dec06210d4175f689
Apr 21 15:13:14.243076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.243041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8jpdv" event={"ID":"c18cbd9b-856a-4074-9e0f-171624debb4f","Type":"ContainerStarted","Data":"87b633f01248d92cec9190d7b940476d590680fa8378b72aacbaa91580a78108"}
Apr 21 15:13:14.243076 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.243080 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8jpdv" event={"ID":"c18cbd9b-856a-4074-9e0f-171624debb4f","Type":"ContainerStarted","Data":"079a2f6d552d4de27ffc2a0aebb34f36c9788bcc705b88d1e2b5baba1da5d683"}
Apr 21 15:13:14.244257 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.244235 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" event={"ID":"53446b04-0fba-43df-ac3d-640ef2ec654d","Type":"ContainerStarted","Data":"7cccc7a3889aa3a6dd872f92726bbe0f2ccc2d68ccb8572abb4f7b5a42483e3d"}
Apr 21 15:13:14.244336 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.244264 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" event={"ID":"53446b04-0fba-43df-ac3d-640ef2ec654d","Type":"ContainerStarted","Data":"ca2853616dd4b462ee791a98d63a60542c9406474220ef3dec06210d4175f689"}
Apr 21 15:13:14.244399 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.244389 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf"
Apr 21 15:13:14.265818 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:14.265753 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" podStartSLOduration=2.265741382 podStartE2EDuration="2.265741382s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:13:14.264662615 +0000 UTC m=+160.144434451" watchObservedRunningTime="2026-04-21 15:13:14.265741382 +0000 UTC m=+160.145513189"
Apr 21 15:13:15.249255 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.249214 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8jpdv" event={"ID":"c18cbd9b-856a-4074-9e0f-171624debb4f","Type":"ContainerStarted","Data":"cb9760236f94636ffc776df484d2ab07ebd6b98b02c46a259703329a4c627e58"}
Apr 21 15:13:15.490275 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.490242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:13:15.490459 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.490308 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:13:15.492864 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.492837 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6626260d-dd76-4a21-87fd-b126ca3a6aac-metrics-tls\") pod \"dns-default-pdvb8\" (UID: \"6626260d-dd76-4a21-87fd-b126ca3a6aac\") " pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:13:15.492983 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.492913 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db283e03-ba54-420d-b4b8-709634355db8-cert\") pod \"ingress-canary-rczsl\" (UID: \"db283e03-ba54-420d-b4b8-709634355db8\") " pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:13:15.737020 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.736989 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\""
Apr 21 15:13:15.745474 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.745445 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:13:15.882431 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:15.882408 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pdvb8"]
Apr 21 15:13:15.884944 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:15.884918 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6626260d_dd76_4a21_87fd_b126ca3a6aac.slice/crio-95b8419c96d08c81687d7024eefd17df3cf893a9a022d213bbbd9045c21f3fbb WatchSource:0}: Error finding container 95b8419c96d08c81687d7024eefd17df3cf893a9a022d213bbbd9045c21f3fbb: Status 404 returned error can't find the container with id 95b8419c96d08c81687d7024eefd17df3cf893a9a022d213bbbd9045c21f3fbb
Apr 21 15:13:16.252485 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:16.252444 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pdvb8" event={"ID":"6626260d-dd76-4a21-87fd-b126ca3a6aac","Type":"ContainerStarted","Data":"95b8419c96d08c81687d7024eefd17df3cf893a9a022d213bbbd9045c21f3fbb"}
Apr 21 15:13:16.254003 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:16.253980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8jpdv" event={"ID":"c18cbd9b-856a-4074-9e0f-171624debb4f","Type":"ContainerStarted","Data":"c54e33581407641850ce074cbbb39ff0a1d2b623811d05036209dc4b60e37ea8"}
Apr 21 15:13:16.291635 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:16.291593 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8jpdv" podStartSLOduration=1.8171105189999999 podStartE2EDuration="4.291579306s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" firstStartedPulling="2026-04-21 15:13:13.332630618 +0000 UTC m=+159.212402410" lastFinishedPulling="2026-04-21 15:13:15.807099392 +0000 UTC m=+161.686871197" observedRunningTime="2026-04-21 15:13:16.291109987 +0000 UTC m=+162.170881823" watchObservedRunningTime="2026-04-21 15:13:16.291579306 +0000 UTC m=+162.171351148"
Apr 21 15:13:17.258589 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:17.258554 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pdvb8" event={"ID":"6626260d-dd76-4a21-87fd-b126ca3a6aac","Type":"ContainerStarted","Data":"b44fbbcdc3a284b417785b8eeb9a1d9a7b1d3857e663bbfde183dee5e4b932ab"}
Apr 21 15:13:18.263317 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:18.263276 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pdvb8" event={"ID":"6626260d-dd76-4a21-87fd-b126ca3a6aac","Type":"ContainerStarted","Data":"f642e3bedcf702e3184ddfdc1cc5a44d6852b65d1d8953d2a6f155f7d7bce732"}
Apr 21 15:13:18.263677 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:18.263410 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:13:18.281409 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:18.281343 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pdvb8" podStartSLOduration=130.030546627 podStartE2EDuration="2m11.281329539s" podCreationTimestamp="2026-04-21 15:11:07 +0000 UTC" firstStartedPulling="2026-04-21 15:13:15.886609741 +0000 UTC m=+161.766381536" lastFinishedPulling="2026-04-21 15:13:17.137392654 +0000 UTC m=+163.017164448" observedRunningTime="2026-04-21 15:13:18.280824709 +0000 UTC m=+164.160596523" watchObservedRunningTime="2026-04-21 15:13:18.281329539 +0000 UTC m=+164.161101354"
Apr 21 15:13:18.937668 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:18.937606 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" podUID="d07b9de0-0135-4f26-8195-12adbe70dc9a" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/healthz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 21 15:13:18.979924 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:18.979886 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" podUID="d07b9de0-0135-4f26-8195-12adbe70dc9a" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 21 15:13:19.267757 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:19.267728 2583 generic.go:358] "Generic (PLEG): container finished" podID="d07b9de0-0135-4f26-8195-12adbe70dc9a" containerID="d344c8c88ac03bcd11e625d26b270861c0799fb18e3fb765da3432d09a902d4b" exitCode=1
Apr 21 15:13:19.268128 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:19.267808 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" event={"ID":"d07b9de0-0135-4f26-8195-12adbe70dc9a","Type":"ContainerDied","Data":"d344c8c88ac03bcd11e625d26b270861c0799fb18e3fb765da3432d09a902d4b"}
Apr 21 15:13:19.268189 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:19.268175 2583 scope.go:117] "RemoveContainer" containerID="d344c8c88ac03bcd11e625d26b270861c0799fb18e3fb765da3432d09a902d4b"
Apr 21 15:13:20.273098 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:20.273026 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q" event={"ID":"d07b9de0-0135-4f26-8195-12adbe70dc9a","Type":"ContainerStarted","Data":"9e2c68840b802c4604d5582dfa9aa476af05816f233d860f7863a32f1de6479a"}
Apr 21 15:13:20.273457 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:20.273313 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:13:20.274255 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:20.274237 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d447f6b5f-8x82q"
Apr 21 15:13:23.744246 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:23.744203 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv"
Apr 21 15:13:24.746034 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:24.746003 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:13:24.748944 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:24.748923 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\""
Apr 21 15:13:24.757009 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:24.756990 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rczsl"
Apr 21 15:13:24.868197 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:24.868169 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rczsl"]
Apr 21 15:13:24.871918 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:24.871887 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb283e03_ba54_420d_b4b8_709634355db8.slice/crio-41345c52c7969b8f4431cb1a77b7f9864cd0438388cf64d5ddca4c35bda830d6 WatchSource:0}: Error finding container 41345c52c7969b8f4431cb1a77b7f9864cd0438388cf64d5ddca4c35bda830d6: Status 404 returned error can't find the container with id 41345c52c7969b8f4431cb1a77b7f9864cd0438388cf64d5ddca4c35bda830d6
Apr 21 15:13:25.287699 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:25.287658 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rczsl" event={"ID":"db283e03-ba54-420d-b4b8-709634355db8","Type":"ContainerStarted","Data":"41345c52c7969b8f4431cb1a77b7f9864cd0438388cf64d5ddca4c35bda830d6"}
Apr 21 15:13:27.294196 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:27.294162 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rczsl" event={"ID":"db283e03-ba54-420d-b4b8-709634355db8","Type":"ContainerStarted","Data":"f5907f2754d0616a5adbaef7554b7950f3e051111710a593e5be14936a9c69d5"}
Apr 21 15:13:28.269959 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:28.269927 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pdvb8"
Apr 21 15:13:28.287719 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:28.287675 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rczsl" podStartSLOduration=139.82232639 podStartE2EDuration="2m21.287662391s" podCreationTimestamp="2026-04-21 15:11:07 +0000 UTC" firstStartedPulling="2026-04-21 15:13:24.873793088 +0000 UTC m=+170.753564879" lastFinishedPulling="2026-04-21 15:13:26.339129085 +0000 UTC m=+172.218900880" observedRunningTime="2026-04-21 15:13:27.315280774 +0000 UTC m=+173.195052587" watchObservedRunningTime="2026-04-21 15:13:28.287662391 +0000 UTC m=+174.167434222"
Apr 21 15:13:33.151707 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.151672 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"]
Apr 21 15:13:33.154899 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.154882 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.157437 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.157416 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 15:13:33.157559 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.157413 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 15:13:33.157949 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.157932 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 15:13:33.157949 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.157940 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-j94tz\""
Apr 21 15:13:33.158043 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.157959 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 15:13:33.158614 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.158500 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 15:13:33.165127 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.165103 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"]
Apr 21 15:13:33.206392 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.206365 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v42pl"]
Apr 21 15:13:33.208657 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.208637 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.208763 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.208671 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdl2t\" (UniqueName: \"kubernetes.io/projected/b0e43122-ccd1-4865-8281-214200591357-kube-api-access-bdl2t\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.208763 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.208705 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.208876 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.208795 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0e43122-ccd1-4865-8281-214200591357-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.209356 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.209343 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.211742 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.211721 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qhrt9\""
Apr 21 15:13:33.211868 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.211747 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 15:13:33.211936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.211882 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 15:13:33.211936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.211911 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 15:13:33.309149 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309121 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.309320 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309160 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-sys\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309320 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309185 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-metrics-client-ca\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309320 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0e43122-ccd1-4865-8281-214200591357-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.309441 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309343 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fz25\" (UniqueName: \"kubernetes.io/projected/68fd2d26-90ba-4d6a-af55-9bbf50606db8-kube-api-access-7fz25\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309441 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309408 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-wtmp\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309518 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309456 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-accelerators-collector-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309518 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309486 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-textfile\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309607 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309588 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309651 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309621 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-root\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.309701 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309680 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.309750 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309720 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdl2t\" (UniqueName: \"kubernetes.io/projected/b0e43122-ccd1-4865-8281-214200591357-kube-api-access-bdl2t\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.309820 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.309755 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.310080 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.310056 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0e43122-ccd1-4865-8281-214200591357-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.311668 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.311648 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.311998 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.311976 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0e43122-ccd1-4865-8281-214200591357-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.325756 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.325733 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdl2t\" (UniqueName: \"kubernetes.io/projected/b0e43122-ccd1-4865-8281-214200591357-kube-api-access-bdl2t\") pod \"openshift-state-metrics-9d44df66c-c98jg\" (UID: \"b0e43122-ccd1-4865-8281-214200591357\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"
Apr 21 15:13:33.338649 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.338626 2583 patch_prober.go:28] interesting pod/image-registry-7968f9c975-jjfvf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 15:13:33.338746 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.338705 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" podUID="53446b04-0fba-43df-ac3d-640ef2ec654d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:13:33.410952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410871 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.410952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410913 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-sys\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.410952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410937 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-metrics-client-ca\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.410952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410964 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fz25\" (UniqueName: \"kubernetes.io/projected/68fd2d26-90ba-4d6a-af55-9bbf50606db8-kube-api-access-7fz25\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410987 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-wtmp\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.410987 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-sys\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:13:33.411026 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411029 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-accelerators-collector-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411056 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-textfile\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: E0421 15:13:33.411094 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls podName:68fd2d26-90ba-4d6a-af55-9bbf50606db8 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:33.911075425 +0000 UTC m=+179.790847218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls") pod "node-exporter-v42pl" (UID: "68fd2d26-90ba-4d6a-af55-9bbf50606db8") : secret "node-exporter-tls" not found
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411158 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-root\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411161 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-wtmp\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411261 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411243 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68fd2d26-90ba-4d6a-af55-9bbf50606db8-root\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl"
Apr 21 15:13:33.411636 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411409 2583 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-textfile\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.411636 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411552 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-metrics-client-ca\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.411636 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.411627 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-accelerators-collector-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.413392 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.413374 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.421113 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.421093 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fz25\" (UniqueName: \"kubernetes.io/projected/68fd2d26-90ba-4d6a-af55-9bbf50606db8-kube-api-access-7fz25\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.466004 
ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.465973 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" Apr 21 15:13:33.584513 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.584456 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg"] Apr 21 15:13:33.586908 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:33.586871 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e43122_ccd1_4865_8281_214200591357.slice/crio-ecb61441232052b0dc612106f855742678ce035f52e356c6a5c2fbaa9e4cf35e WatchSource:0}: Error finding container ecb61441232052b0dc612106f855742678ce035f52e356c6a5c2fbaa9e4cf35e: Status 404 returned error can't find the container with id ecb61441232052b0dc612106f855742678ce035f52e356c6a5c2fbaa9e4cf35e Apr 21 15:13:33.914579 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.914545 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:33.916740 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:33.916699 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68fd2d26-90ba-4d6a-af55-9bbf50606db8-node-exporter-tls\") pod \"node-exporter-v42pl\" (UID: \"68fd2d26-90ba-4d6a-af55-9bbf50606db8\") " pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:34.117897 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:34.117860 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v42pl" Apr 21 15:13:34.125520 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:13:34.125484 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68fd2d26_90ba_4d6a_af55_9bbf50606db8.slice/crio-77b561c99b5ed22eac7938bd470071a3c56e0ce217191a6debe6096d17e5b030 WatchSource:0}: Error finding container 77b561c99b5ed22eac7938bd470071a3c56e0ce217191a6debe6096d17e5b030: Status 404 returned error can't find the container with id 77b561c99b5ed22eac7938bd470071a3c56e0ce217191a6debe6096d17e5b030 Apr 21 15:13:34.315585 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:34.315544 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" event={"ID":"b0e43122-ccd1-4865-8281-214200591357","Type":"ContainerStarted","Data":"e7784fcc2278bce2d45d7846bd4adf44b246f0cf083df64601dfd88f2ad33f87"} Apr 21 15:13:34.315585 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:34.315591 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" event={"ID":"b0e43122-ccd1-4865-8281-214200591357","Type":"ContainerStarted","Data":"c40c39af8ca5c699d0230d006d67da516812c226827603e302e4370bd34d6b7a"} Apr 21 15:13:34.316099 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:34.315604 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" event={"ID":"b0e43122-ccd1-4865-8281-214200591357","Type":"ContainerStarted","Data":"ecb61441232052b0dc612106f855742678ce035f52e356c6a5c2fbaa9e4cf35e"} Apr 21 15:13:34.316812 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:34.316760 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v42pl" 
event={"ID":"68fd2d26-90ba-4d6a-af55-9bbf50606db8","Type":"ContainerStarted","Data":"77b561c99b5ed22eac7938bd470071a3c56e0ce217191a6debe6096d17e5b030"} Apr 21 15:13:35.253402 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:35.253374 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7968f9c975-jjfvf" Apr 21 15:13:35.321610 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:35.321571 2583 generic.go:358] "Generic (PLEG): container finished" podID="68fd2d26-90ba-4d6a-af55-9bbf50606db8" containerID="b467f2062dd139f62e6855559253ba93303e94f866a97e28e87ea9c403303f70" exitCode=0 Apr 21 15:13:35.322075 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:35.321643 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v42pl" event={"ID":"68fd2d26-90ba-4d6a-af55-9bbf50606db8","Type":"ContainerDied","Data":"b467f2062dd139f62e6855559253ba93303e94f866a97e28e87ea9c403303f70"} Apr 21 15:13:35.323727 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:35.323703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" event={"ID":"b0e43122-ccd1-4865-8281-214200591357","Type":"ContainerStarted","Data":"276e9f5332c540295f5eb291518a7823e08b7db6a2657376b29d3b9735879a82"} Apr 21 15:13:35.358263 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:35.358220 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c98jg" podStartSLOduration=1.158930674 podStartE2EDuration="2.358207454s" podCreationTimestamp="2026-04-21 15:13:33 +0000 UTC" firstStartedPulling="2026-04-21 15:13:33.715360501 +0000 UTC m=+179.595132299" lastFinishedPulling="2026-04-21 15:13:34.914637289 +0000 UTC m=+180.794409079" observedRunningTime="2026-04-21 15:13:35.357302933 +0000 UTC m=+181.237074745" watchObservedRunningTime="2026-04-21 15:13:35.358207454 +0000 UTC m=+181.237979267" 
Apr 21 15:13:36.328393 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:36.328359 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v42pl" event={"ID":"68fd2d26-90ba-4d6a-af55-9bbf50606db8","Type":"ContainerStarted","Data":"54c2b93dbbe8810619f8a65a7da0a9453f7b73ea46026e629a6163f1a433d443"} Apr 21 15:13:36.328393 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:36.328401 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v42pl" event={"ID":"68fd2d26-90ba-4d6a-af55-9bbf50606db8","Type":"ContainerStarted","Data":"4abdc25bc9bd8e61afce778cb1c319ee227de027518287042e118bb362a3b051"} Apr 21 15:13:36.349103 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:36.349064 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v42pl" podStartSLOduration=2.5598928130000003 podStartE2EDuration="3.349050769s" podCreationTimestamp="2026-04-21 15:13:33 +0000 UTC" firstStartedPulling="2026-04-21 15:13:34.127128713 +0000 UTC m=+180.006900503" lastFinishedPulling="2026-04-21 15:13:34.916286664 +0000 UTC m=+180.796058459" observedRunningTime="2026-04-21 15:13:36.348696751 +0000 UTC m=+182.228468563" watchObservedRunningTime="2026-04-21 15:13:36.349050769 +0000 UTC m=+182.228822582" Apr 21 15:13:58.958657 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:13:58.958617 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" podUID="f1f1992b-382d-499c-8974-94ab1d508626" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 15:14:08.957918 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:08.957872 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" podUID="f1f1992b-382d-499c-8974-94ab1d508626" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 15:14:18.958293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:18.958253 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" podUID="f1f1992b-382d-499c-8974-94ab1d508626" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 15:14:18.958653 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:18.958347 2583 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" Apr 21 15:14:18.958828 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:18.958798 2583 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"421d98eaf8e8495cd00c532e740eb82a5586e583070e4e81801675c0022f07da"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 15:14:18.958875 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:18.958859 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" podUID="f1f1992b-382d-499c-8974-94ab1d508626" containerName="service-proxy" containerID="cri-o://421d98eaf8e8495cd00c532e740eb82a5586e583070e4e81801675c0022f07da" gracePeriod=30 Apr 21 15:14:19.433192 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:19.433103 2583 generic.go:358] "Generic (PLEG): container finished" podID="f1f1992b-382d-499c-8974-94ab1d508626" containerID="421d98eaf8e8495cd00c532e740eb82a5586e583070e4e81801675c0022f07da" exitCode=2 Apr 21 15:14:19.433192 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:19.433161 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerDied","Data":"421d98eaf8e8495cd00c532e740eb82a5586e583070e4e81801675c0022f07da"} Apr 21 15:14:19.433365 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:19.433197 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ff7fc8b67-p2rcd" event={"ID":"f1f1992b-382d-499c-8974-94ab1d508626","Type":"ContainerStarted","Data":"f3125ca551eb95e5941dcd8e4f5dec231430e59bffaf08f96b78f386334b398e"} Apr 21 15:14:46.544932 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:46.544893 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:14:46.547083 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:46.547061 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad-metrics-certs\") pod \"network-metrics-daemon-wpmpv\" (UID: \"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad\") " pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:14:46.847706 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:46.847624 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\"" Apr 21 15:14:46.855260 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:46.855240 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wpmpv" Apr 21 15:14:46.978591 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:46.978557 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wpmpv"] Apr 21 15:14:46.982227 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:14:46.982193 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc02d04_6c8a_4b9f_a09b_5bdf11c8e8ad.slice/crio-aed30a1ceb72a2f347fa6ecba048658bb4a0dfa3d615549225eb98b0f755cf11 WatchSource:0}: Error finding container aed30a1ceb72a2f347fa6ecba048658bb4a0dfa3d615549225eb98b0f755cf11: Status 404 returned error can't find the container with id aed30a1ceb72a2f347fa6ecba048658bb4a0dfa3d615549225eb98b0f755cf11 Apr 21 15:14:47.503630 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:47.503591 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wpmpv" event={"ID":"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad","Type":"ContainerStarted","Data":"aed30a1ceb72a2f347fa6ecba048658bb4a0dfa3d615549225eb98b0f755cf11"} Apr 21 15:14:48.508090 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:48.508048 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wpmpv" event={"ID":"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad","Type":"ContainerStarted","Data":"185c2b0f05d9bc619100a3c9c4e2105bd3a775c92229b39aa009cbf7fd71e724"} Apr 21 15:14:48.508090 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:48.508092 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wpmpv" event={"ID":"3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad","Type":"ContainerStarted","Data":"2df440b9daeeeff6657234916032c52006a1c428113fde59744f3252fdb4ac95"} Apr 21 15:14:48.524839 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:14:48.524788 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-wpmpv" podStartSLOduration=253.584930341 podStartE2EDuration="4m14.524757758s" podCreationTimestamp="2026-04-21 15:10:34 +0000 UTC" firstStartedPulling="2026-04-21 15:14:46.983963386 +0000 UTC m=+252.863735177" lastFinishedPulling="2026-04-21 15:14:47.923790801 +0000 UTC m=+253.803562594" observedRunningTime="2026-04-21 15:14:48.524547465 +0000 UTC m=+254.404319275" watchObservedRunningTime="2026-04-21 15:14:48.524757758 +0000 UTC m=+254.404529571" Apr 21 15:15:34.665450 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:15:34.665420 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:16:50.293204 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.293129 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p"] Apr 21 15:16:50.296235 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.296219 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.300730 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.298752 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-zcx7j\"" Apr 21 15:16:50.300730 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.299210 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:16:50.300730 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.300439 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 21 15:16:50.309003 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.308982 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p"] Apr 21 15:16:50.397444 ip-10-0-143-120 kubenswrapper[2583]: 
I0421 15:16:50.397410 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gpgg\" (UniqueName: \"kubernetes.io/projected/4808c1f2-e2ab-4d58-a329-b1d4007917a4-kube-api-access-7gpgg\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: \"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.397611 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.397484 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4808c1f2-e2ab-4d58-a329-b1d4007917a4-tmp\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: \"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.497910 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.497881 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gpgg\" (UniqueName: \"kubernetes.io/projected/4808c1f2-e2ab-4d58-a329-b1d4007917a4-kube-api-access-7gpgg\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: \"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.498044 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.497931 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4808c1f2-e2ab-4d58-a329-b1d4007917a4-tmp\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: \"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.498228 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.498214 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4808c1f2-e2ab-4d58-a329-b1d4007917a4-tmp\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: 
\"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.506833 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.506812 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gpgg\" (UniqueName: \"kubernetes.io/projected/4808c1f2-e2ab-4d58-a329-b1d4007917a4-kube-api-access-7gpgg\") pod \"jobset-operator-747c5859c7-fkq8p\" (UID: \"4808c1f2-e2ab-4d58-a329-b1d4007917a4\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.606940 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.606882 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" Apr 21 15:16:50.729018 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:16:50.728984 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4808c1f2_e2ab_4d58_a329_b1d4007917a4.slice/crio-9a1a263bb808db7a72c503824367b57858e71fb49964fd0cd34845a4b53e3d26 WatchSource:0}: Error finding container 9a1a263bb808db7a72c503824367b57858e71fb49964fd0cd34845a4b53e3d26: Status 404 returned error can't find the container with id 9a1a263bb808db7a72c503824367b57858e71fb49964fd0cd34845a4b53e3d26 Apr 21 15:16:50.730698 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.730682 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:16:50.730967 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.730948 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p"] Apr 21 15:16:50.814708 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:50.814676 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" 
event={"ID":"4808c1f2-e2ab-4d58-a329-b1d4007917a4","Type":"ContainerStarted","Data":"9a1a263bb808db7a72c503824367b57858e71fb49964fd0cd34845a4b53e3d26"} Apr 21 15:16:54.826606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:16:54.826575 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" event={"ID":"4808c1f2-e2ab-4d58-a329-b1d4007917a4","Type":"ContainerStarted","Data":"d83d1afab222e6e03d9746df9f0d56208ebaddd2045487b5a7eefa4dba54909c"} Apr 21 15:19:09.627627 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.627562 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-fkq8p" podStartSLOduration=136.154622975 podStartE2EDuration="2m19.627543688s" podCreationTimestamp="2026-04-21 15:16:50 +0000 UTC" firstStartedPulling="2026-04-21 15:16:50.730837016 +0000 UTC m=+376.610608807" lastFinishedPulling="2026-04-21 15:16:54.203757716 +0000 UTC m=+380.083529520" observedRunningTime="2026-04-21 15:16:54.848827252 +0000 UTC m=+380.728599064" watchObservedRunningTime="2026-04-21 15:19:09.627543688 +0000 UTC m=+515.507315501" Apr 21 15:19:09.628245 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.627705 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"] Apr 21 15:19:09.630750 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.630727 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" Apr 21 15:19:09.633524 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.633500 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\"" Apr 21 15:19:09.634336 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.634320 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\"" Apr 21 15:19:09.634395 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.634359 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\"" Apr 21 15:19:09.666698 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.666671 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"] Apr 21 15:19:09.753722 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.753687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnbh4\" (UniqueName: \"kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4\") pod \"progression-enabled-node-0-0-cnbd6\" (UID: \"c041aae4-d3ed-44c5-a8a6-00b55553dfcc\") " pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" Apr 21 15:19:09.854397 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.854364 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnbh4\" (UniqueName: \"kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4\") pod \"progression-enabled-node-0-0-cnbd6\" (UID: \"c041aae4-d3ed-44c5-a8a6-00b55553dfcc\") " pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" Apr 21 15:19:09.863958 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.863930 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gnbh4\" (UniqueName: \"kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4\") pod \"progression-enabled-node-0-0-cnbd6\" (UID: \"c041aae4-d3ed-44c5-a8a6-00b55553dfcc\") " pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" Apr 21 15:19:09.940650 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:09.940589 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" Apr 21 15:19:10.066758 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:10.066725 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"] Apr 21 15:19:10.070970 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:19:10.070945 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc041aae4_d3ed_44c5_a8a6_00b55553dfcc.slice/crio-320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3 WatchSource:0}: Error finding container 320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3: Status 404 returned error can't find the container with id 320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3 Apr 21 15:19:10.161832 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:19:10.161801 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" event={"ID":"c041aae4-d3ed-44c5-a8a6-00b55553dfcc","Type":"ContainerStarted","Data":"320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3"} Apr 21 15:21:11.494874 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:11.494837 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" 
event={"ID":"c041aae4-d3ed-44c5-a8a6-00b55553dfcc","Type":"ContainerStarted","Data":"e456c5b7c72344ff41150f1ab96220f0bfc98368d735b805da7d1f72b2cbc473"}
Apr 21 15:21:11.573797 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:11.494959 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"
Apr 21 15:21:11.573797 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:11.520966 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" podStartSLOduration=1.467698813 podStartE2EDuration="2m2.520953959s" podCreationTimestamp="2026-04-21 15:19:09 +0000 UTC" firstStartedPulling="2026-04-21 15:19:10.073200048 +0000 UTC m=+515.952971839" lastFinishedPulling="2026-04-21 15:21:11.126455174 +0000 UTC m=+637.006226985" observedRunningTime="2026-04-21 15:21:11.519875052 +0000 UTC m=+637.399646865" watchObservedRunningTime="2026-04-21 15:21:11.520953959 +0000 UTC m=+637.400725772"
Apr 21 15:21:13.501016 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:13.500989 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"
Apr 21 15:21:34.498895 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:34.498855 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" podUID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" containerName="node" probeResult="failure" output="Get \"http://10.133.0.14:28080/metrics\": dial tcp 10.133.0.14:28080: connect: connection refused"
Apr 21 15:21:34.556556 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:34.556527 2583 generic.go:358] "Generic (PLEG): container finished" podID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" containerID="e456c5b7c72344ff41150f1ab96220f0bfc98368d735b805da7d1f72b2cbc473" exitCode=0
Apr 21 15:21:34.556682 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:34.556592 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" event={"ID":"c041aae4-d3ed-44c5-a8a6-00b55553dfcc","Type":"ContainerDied","Data":"e456c5b7c72344ff41150f1ab96220f0bfc98368d735b805da7d1f72b2cbc473"}
Apr 21 15:21:35.678733 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:35.678713 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"
Apr 21 15:21:35.772626 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:35.772592 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnbh4\" (UniqueName: \"kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4\") pod \"c041aae4-d3ed-44c5-a8a6-00b55553dfcc\" (UID: \"c041aae4-d3ed-44c5-a8a6-00b55553dfcc\") "
Apr 21 15:21:35.774700 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:35.774667 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4" (OuterVolumeSpecName: "kube-api-access-gnbh4") pod "c041aae4-d3ed-44c5-a8a6-00b55553dfcc" (UID: "c041aae4-d3ed-44c5-a8a6-00b55553dfcc"). InnerVolumeSpecName "kube-api-access-gnbh4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:21:35.873791 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:35.873713 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnbh4\" (UniqueName: \"kubernetes.io/projected/c041aae4-d3ed-44c5-a8a6-00b55553dfcc-kube-api-access-gnbh4\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\""
Apr 21 15:21:36.562605 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:36.562569 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6" event={"ID":"c041aae4-d3ed-44c5-a8a6-00b55553dfcc","Type":"ContainerDied","Data":"320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3"}
Apr 21 15:21:36.562605 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:36.562606 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320e793c1fc53ba05c691417072ed7223adeee2fa0b64651ec22942886b6e6a3"
Apr 21 15:21:36.562855 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:36.562626 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"
Apr 21 15:21:38.018210 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.018176 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"]
Apr 21 15:21:38.018662 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.018404 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" containerName="node"
Apr 21 15:21:38.018662 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.018414 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" containerName="node"
Apr 21 15:21:38.018662 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.018467 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" containerName="node"
Apr 21 15:21:38.039986 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.039963 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"]
Apr 21 15:21:38.040113 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.040071 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:38.042482 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.042462 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\""
Apr 21 15:21:38.042606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.042508 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\""
Apr 21 15:21:38.042606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.042543 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\""
Apr 21 15:21:38.090711 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.090683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kvxq\" (UniqueName: \"kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq\") pod \"progression-disabled-node-0-0-ql7tg\" (UID: \"dea0e27e-666a-4915-9207-27ec45e83692\") " pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:38.191898 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.191871 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kvxq\" (UniqueName: \"kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq\") pod \"progression-disabled-node-0-0-ql7tg\" (UID: \"dea0e27e-666a-4915-9207-27ec45e83692\") " pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:38.200636 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.200616 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kvxq\" (UniqueName: \"kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq\") pod \"progression-disabled-node-0-0-ql7tg\" (UID: \"dea0e27e-666a-4915-9207-27ec45e83692\") " pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:38.349175 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.349119 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:38.459027 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.458988 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"]
Apr 21 15:21:38.462072 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:21:38.462038 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea0e27e_666a_4915_9207_27ec45e83692.slice/crio-9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6 WatchSource:0}: Error finding container 9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6: Status 404 returned error can't find the container with id 9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6
Apr 21 15:21:38.568762 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:38.568729 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" event={"ID":"dea0e27e-666a-4915-9207-27ec45e83692","Type":"ContainerStarted","Data":"9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6"}
Apr 21 15:21:39.573033 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:39.572999 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" event={"ID":"dea0e27e-666a-4915-9207-27ec45e83692","Type":"ContainerStarted","Data":"e17b51e8d47af088006d1779ba2b6223da7508dbf420ffc4ecdcc5ed5dcac762"}
Apr 21 15:21:39.573484 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:39.573169 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:21:39.589270 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:39.589222 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" podStartSLOduration=2.589209771 podStartE2EDuration="2.589209771s" podCreationTimestamp="2026-04-21 15:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:21:39.58749978 +0000 UTC m=+665.467271593" watchObservedRunningTime="2026-04-21 15:21:39.589209771 +0000 UTC m=+665.468981583"
Apr 21 15:21:40.575854 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:21:40.575818 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:22:01.579396 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:01.579358 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" podUID="dea0e27e-666a-4915-9207-27ec45e83692" containerName="node" probeResult="failure" output="Get \"http://10.133.0.15:28080/metrics\": read tcp 10.133.0.2:58896->10.133.0.15:28080: read: connection reset by peer"
Apr 21 15:22:01.630779 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:01.630740 2583 generic.go:358] "Generic (PLEG): container finished" podID="dea0e27e-666a-4915-9207-27ec45e83692" containerID="e17b51e8d47af088006d1779ba2b6223da7508dbf420ffc4ecdcc5ed5dcac762" exitCode=0
Apr 21 15:22:01.630873 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:01.630806 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" event={"ID":"dea0e27e-666a-4915-9207-27ec45e83692","Type":"ContainerDied","Data":"e17b51e8d47af088006d1779ba2b6223da7508dbf420ffc4ecdcc5ed5dcac762"}
Apr 21 15:22:02.754090 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:02.754068 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:22:02.861007 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:02.860978 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kvxq\" (UniqueName: \"kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq\") pod \"dea0e27e-666a-4915-9207-27ec45e83692\" (UID: \"dea0e27e-666a-4915-9207-27ec45e83692\") "
Apr 21 15:22:02.862943 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:02.862919 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq" (OuterVolumeSpecName: "kube-api-access-4kvxq") pod "dea0e27e-666a-4915-9207-27ec45e83692" (UID: "dea0e27e-666a-4915-9207-27ec45e83692"). InnerVolumeSpecName "kube-api-access-4kvxq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:22:02.961884 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:02.961855 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kvxq\" (UniqueName: \"kubernetes.io/projected/dea0e27e-666a-4915-9207-27ec45e83692-kube-api-access-4kvxq\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\""
Apr 21 15:22:03.637522 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:03.637488 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg" event={"ID":"dea0e27e-666a-4915-9207-27ec45e83692","Type":"ContainerDied","Data":"9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6"}
Apr 21 15:22:03.637522 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:03.637514 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"
Apr 21 15:22:03.637728 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:03.637521 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b45cad964a57473272f619e5afe7ecc23c03561ac7ff42235fb4c7b006d37a6"
Apr 21 15:22:13.035834 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.035802 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"]
Apr 21 15:22:13.036182 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.036037 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dea0e27e-666a-4915-9207-27ec45e83692" containerName="node"
Apr 21 15:22:13.036182 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.036048 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea0e27e-666a-4915-9207-27ec45e83692" containerName="node"
Apr 21 15:22:13.036182 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.036092 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="dea0e27e-666a-4915-9207-27ec45e83692" containerName="node"
Apr 21 15:22:13.039108 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.039089 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.042886 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.042867 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\""
Apr 21 15:22:13.043129 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.043109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\""
Apr 21 15:22:13.043602 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.043588 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\""
Apr 21 15:22:13.051952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.051933 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"]
Apr 21 15:22:13.126538 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.126509 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwhhh\" (UniqueName: \"kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh\") pod \"progression-invalid-node-0-0-znm67\" (UID: \"52a00751-aaec-462a-ab1c-d8fa005cfb46\") " pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.227050 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.227019 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwhhh\" (UniqueName: \"kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh\") pod \"progression-invalid-node-0-0-znm67\" (UID: \"52a00751-aaec-462a-ab1c-d8fa005cfb46\") " pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.240089 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.240063 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwhhh\" (UniqueName: \"kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh\") pod \"progression-invalid-node-0-0-znm67\" (UID: \"52a00751-aaec-462a-ab1c-d8fa005cfb46\") " pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.347331 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.347261 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.473129 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.473102 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"]
Apr 21 15:22:13.476271 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:22:13.476245 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52a00751_aaec_462a_ab1c_d8fa005cfb46.slice/crio-3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d WatchSource:0}: Error finding container 3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d: Status 404 returned error can't find the container with id 3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d
Apr 21 15:22:13.478518 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.478501 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:22:13.664884 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.664769 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" event={"ID":"52a00751-aaec-462a-ab1c-d8fa005cfb46","Type":"ContainerStarted","Data":"48efa9f19249314fe6cf49a78b9570983a3f9e584958613ebedc6444513dd425"}
Apr 21 15:22:13.664884 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.664824 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" event={"ID":"52a00751-aaec-462a-ab1c-d8fa005cfb46","Type":"ContainerStarted","Data":"3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d"}
Apr 21 15:22:13.665063 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.664894 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:13.688578 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:13.688527 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" podStartSLOduration=0.688508813 podStartE2EDuration="688.508813ms" podCreationTimestamp="2026-04-21 15:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:22:13.684881394 +0000 UTC m=+699.564653201" watchObservedRunningTime="2026-04-21 15:22:13.688508813 +0000 UTC m=+699.568280628"
Apr 21 15:22:15.671283 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:15.671253 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:36.669101 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:36.669055 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" podUID="52a00751-aaec-462a-ab1c-d8fa005cfb46" containerName="node" probeResult="failure" output="Get \"http://10.133.0.16:28080/metrics\": dial tcp 10.133.0.16:28080: connect: connection refused"
Apr 21 15:22:36.727060 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:36.727027 2583 generic.go:358] "Generic (PLEG): container finished" podID="52a00751-aaec-462a-ab1c-d8fa005cfb46" containerID="48efa9f19249314fe6cf49a78b9570983a3f9e584958613ebedc6444513dd425" exitCode=0
Apr 21 15:22:36.727208 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:36.727085 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" event={"ID":"52a00751-aaec-462a-ab1c-d8fa005cfb46","Type":"ContainerDied","Data":"48efa9f19249314fe6cf49a78b9570983a3f9e584958613ebedc6444513dd425"}
Apr 21 15:22:37.842931 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:37.842908 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:22:37.998583 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:37.998508 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwhhh\" (UniqueName: \"kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh\") pod \"52a00751-aaec-462a-ab1c-d8fa005cfb46\" (UID: \"52a00751-aaec-462a-ab1c-d8fa005cfb46\") "
Apr 21 15:22:38.000501 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:38.000466 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh" (OuterVolumeSpecName: "kube-api-access-qwhhh") pod "52a00751-aaec-462a-ab1c-d8fa005cfb46" (UID: "52a00751-aaec-462a-ab1c-d8fa005cfb46"). InnerVolumeSpecName "kube-api-access-qwhhh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:22:38.099325 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:38.099292 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwhhh\" (UniqueName: \"kubernetes.io/projected/52a00751-aaec-462a-ab1c-d8fa005cfb46-kube-api-access-qwhhh\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\""
Apr 21 15:22:38.734053 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:38.734018 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67" event={"ID":"52a00751-aaec-462a-ab1c-d8fa005cfb46","Type":"ContainerDied","Data":"3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d"}
Apr 21 15:22:38.734053 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:38.734053 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7da5e5ce53432d1c27e77db0f8c902ae7d011554258814f556512ffa51342d"
Apr 21 15:22:38.734245 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:22:38.734076 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"
Apr 21 15:24:34.635588 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.635558 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"]
Apr 21 15:24:34.636273 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.635801 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52a00751-aaec-462a-ab1c-d8fa005cfb46" containerName="node"
Apr 21 15:24:34.636273 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.635811 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a00751-aaec-462a-ab1c-d8fa005cfb46" containerName="node"
Apr 21 15:24:34.636273 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.635873 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="52a00751-aaec-462a-ab1c-d8fa005cfb46" containerName="node"
Apr 21 15:24:34.638537 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.638520 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:34.640699 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.640676 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\""
Apr 21 15:24:34.641471 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.641451 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\""
Apr 21 15:24:34.641575 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.641474 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\""
Apr 21 15:24:34.645952 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.645926 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"]
Apr 21 15:24:34.784649 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.784621 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnc9g\" (UniqueName: \"kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g\") pod \"progression-no-metrics-node-0-0-kdmlq\" (UID: \"063647a9-1ab9-4bb7-a48a-3ac175a34543\") " pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:34.885667 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.885574 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnc9g\" (UniqueName: \"kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g\") pod \"progression-no-metrics-node-0-0-kdmlq\" (UID: \"063647a9-1ab9-4bb7-a48a-3ac175a34543\") " pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:34.895170 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.895144 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnc9g\" (UniqueName: \"kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g\") pod \"progression-no-metrics-node-0-0-kdmlq\" (UID: \"063647a9-1ab9-4bb7-a48a-3ac175a34543\") " pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:34.948011 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:34.947986 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:35.061863 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:35.061817 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"]
Apr 21 15:24:35.064608 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:24:35.064585 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063647a9_1ab9_4bb7_a48a_3ac175a34543.slice/crio-28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120 WatchSource:0}: Error finding container 28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120: Status 404 returned error can't find the container with id 28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120
Apr 21 15:24:36.052174 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:36.052142 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq" event={"ID":"063647a9-1ab9-4bb7-a48a-3ac175a34543","Type":"ContainerStarted","Data":"bb7a10b4253abe4bffae0bb9e62d24fa1596cbb45215241b1a3e752094ff5e6e"}
Apr 21 15:24:36.052174 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:36.052179 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq" event={"ID":"063647a9-1ab9-4bb7-a48a-3ac175a34543","Type":"ContainerStarted","Data":"28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120"}
Apr 21 15:24:36.067406 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:36.067363 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq" podStartSLOduration=2.067350019 podStartE2EDuration="2.067350019s" podCreationTimestamp="2026-04-21 15:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:24:36.066272926 +0000 UTC m=+841.946044729" watchObservedRunningTime="2026-04-21 15:24:36.067350019 +0000 UTC m=+841.947121831"
Apr 21 15:24:41.066381 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:41.066345 2583 generic.go:358] "Generic (PLEG): container finished" podID="063647a9-1ab9-4bb7-a48a-3ac175a34543" containerID="bb7a10b4253abe4bffae0bb9e62d24fa1596cbb45215241b1a3e752094ff5e6e" exitCode=0
Apr 21 15:24:41.066381 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:41.066381 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq" event={"ID":"063647a9-1ab9-4bb7-a48a-3ac175a34543","Type":"ContainerDied","Data":"bb7a10b4253abe4bffae0bb9e62d24fa1596cbb45215241b1a3e752094ff5e6e"}
Apr 21 15:24:42.181378 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:42.181355 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:42.339212 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:42.339133 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnc9g\" (UniqueName: \"kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g\") pod \"063647a9-1ab9-4bb7-a48a-3ac175a34543\" (UID: \"063647a9-1ab9-4bb7-a48a-3ac175a34543\") "
Apr 21 15:24:42.341107 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:42.341075 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g" (OuterVolumeSpecName: "kube-api-access-hnc9g") pod "063647a9-1ab9-4bb7-a48a-3ac175a34543" (UID: "063647a9-1ab9-4bb7-a48a-3ac175a34543"). InnerVolumeSpecName "kube-api-access-hnc9g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:24:42.440047 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:42.440020 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnc9g\" (UniqueName: \"kubernetes.io/projected/063647a9-1ab9-4bb7-a48a-3ac175a34543-kube-api-access-hnc9g\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\""
Apr 21 15:24:43.076607 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:43.076537 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"
Apr 21 15:24:43.076753 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:43.076533 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq" event={"ID":"063647a9-1ab9-4bb7-a48a-3ac175a34543","Type":"ContainerDied","Data":"28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120"}
Apr 21 15:24:43.076753 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:43.076637 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f3c5045fa1a97842fb5be2936c7d5a3b8a9f4f380c2d10db4b77d5ec056120"
Apr 21 15:24:46.793511 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.793477 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5v6kh/must-gather-2wc9p"]
Apr 21 15:24:46.793908 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.793695 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="063647a9-1ab9-4bb7-a48a-3ac175a34543" containerName="node"
Apr 21 15:24:46.793908 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.793705 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="063647a9-1ab9-4bb7-a48a-3ac175a34543" containerName="node"
Apr 21 15:24:46.793908 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.793760 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="063647a9-1ab9-4bb7-a48a-3ac175a34543" containerName="node"
Apr 21 15:24:46.796763 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.796747 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.799055 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.799025 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5v6kh\"/\"default-dockercfg-f2qzk\""
Apr 21 15:24:46.799895 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.799878 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v6kh\"/\"kube-root-ca.crt\""
Apr 21 15:24:46.799988 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.799939 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v6kh\"/\"openshift-service-ca.crt\""
Apr 21 15:24:46.805455 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.805435 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v6kh/must-gather-2wc9p"]
Apr 21 15:24:46.865953 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.865931 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.866071 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.865967 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9t4\" (UniqueName: \"kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.967170 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.967147 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.967293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.967186 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9t4\" (UniqueName: \"kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.967561 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.967543 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:46.974878 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:46.974857 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9t4\" (UniqueName: \"kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4\") pod \"must-gather-2wc9p\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:47.105269 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:47.105203 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5v6kh/must-gather-2wc9p"
Apr 21 15:24:47.217837 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:47.217807 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v6kh/must-gather-2wc9p"]
Apr 21 15:24:47.220727 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:24:47.220698 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5be69dd_0d09_45cb_82d2_7dc3add1d4c3.slice/crio-e5103a2efbfe619795719368156d72de61a9ab302cc67a4ddccfc741ceac3164 WatchSource:0}: Error finding container e5103a2efbfe619795719368156d72de61a9ab302cc67a4ddccfc741ceac3164: Status 404 returned error can't find the container with id e5103a2efbfe619795719368156d72de61a9ab302cc67a4ddccfc741ceac3164
Apr 21 15:24:48.093161 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:48.093125 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" event={"ID":"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3","Type":"ContainerStarted","Data":"e5103a2efbfe619795719368156d72de61a9ab302cc67a4ddccfc741ceac3164"}
Apr 21 15:24:51.606886 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.606845 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"]
Apr 21 15:24:51.612790 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.612742 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-disabled-node-0-0-ql7tg"]
Apr 21 15:24:51.618036 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.618007 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"]
Apr 21 15:24:51.621411 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.621383 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-enabled-node-0-0-cnbd6"]
Apr 21 15:24:51.625860 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.625839 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"]
Apr 21 15:24:51.629940 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.629920 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-invalid-node-0-0-znm67"]
Apr 21 15:24:51.647084 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.647056 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"]
Apr 21 15:24:51.652736 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:51.652711 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-no-metrics-node-0-0-kdmlq"]
Apr 21 15:24:52.748816 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:52.748760 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063647a9-1ab9-4bb7-a48a-3ac175a34543" path="/var/lib/kubelet/pods/063647a9-1ab9-4bb7-a48a-3ac175a34543/volumes"
Apr 21 15:24:52.749289 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:52.749144 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a00751-aaec-462a-ab1c-d8fa005cfb46" path="/var/lib/kubelet/pods/52a00751-aaec-462a-ab1c-d8fa005cfb46/volumes"
Apr 21 15:24:52.749468 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:52.749453 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c041aae4-d3ed-44c5-a8a6-00b55553dfcc" path="/var/lib/kubelet/pods/c041aae4-d3ed-44c5-a8a6-00b55553dfcc/volumes"
Apr 21 15:24:52.749758 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:52.749745 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea0e27e-666a-4915-9207-27ec45e83692" path="/var/lib/kubelet/pods/dea0e27e-666a-4915-9207-27ec45e83692/volumes"
Apr 21 15:24:53.109398 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:53.109312 2583
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" event={"ID":"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3","Type":"ContainerStarted","Data":"00163bebef1b3f9340330e0b8cd0e6023519471fab9821b5f94fc580c1538282"} Apr 21 15:24:53.109398 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:53.109354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" event={"ID":"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3","Type":"ContainerStarted","Data":"49c2b1188f92377f98c2307afdeabb0adf367cd469e92dd981fb149fa82d7566"} Apr 21 15:24:53.127276 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:24:53.127222 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" podStartSLOduration=2.322068501 podStartE2EDuration="7.127201157s" podCreationTimestamp="2026-04-21 15:24:46 +0000 UTC" firstStartedPulling="2026-04-21 15:24:47.222311815 +0000 UTC m=+853.102083607" lastFinishedPulling="2026-04-21 15:24:52.027444469 +0000 UTC m=+857.907216263" observedRunningTime="2026-04-21 15:24:53.126331579 +0000 UTC m=+859.006103391" watchObservedRunningTime="2026-04-21 15:24:53.127201157 +0000 UTC m=+859.006972971" Apr 21 15:25:40.247183 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:40.247148 2583 generic.go:358] "Generic (PLEG): container finished" podID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerID="49c2b1188f92377f98c2307afdeabb0adf367cd469e92dd981fb149fa82d7566" exitCode=0 Apr 21 15:25:40.247583 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:40.247187 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" event={"ID":"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3","Type":"ContainerDied","Data":"49c2b1188f92377f98c2307afdeabb0adf367cd469e92dd981fb149fa82d7566"} Apr 21 15:25:40.247583 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:40.247506 2583 scope.go:117] "RemoveContainer" 
containerID="49c2b1188f92377f98c2307afdeabb0adf367cd469e92dd981fb149fa82d7566" Apr 21 15:25:40.643691 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:40.643622 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5v6kh_must-gather-2wc9p_f5be69dd-0d09-45cb-82d2-7dc3add1d4c3/gather/0.log" Apr 21 15:25:41.176225 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.176186 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2xz6/must-gather-gknmd"] Apr 21 15:25:41.179501 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.179478 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.182165 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.182146 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"openshift-service-ca.crt\"" Apr 21 15:25:41.182395 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.182379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p2xz6\"/\"default-dockercfg-flhhr\"" Apr 21 15:25:41.183013 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.182999 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"kube-root-ca.crt\"" Apr 21 15:25:41.187079 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.187060 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/must-gather-gknmd"] Apr 21 15:25:41.301194 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.301167 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwbc\" (UniqueName: \"kubernetes.io/projected/a220a938-f1ec-4ad6-a50e-c54634ace07e-kube-api-access-rwwbc\") pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " 
pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.301523 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.301223 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a220a938-f1ec-4ad6-a50e-c54634ace07e-must-gather-output\") pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.401515 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.401491 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a220a938-f1ec-4ad6-a50e-c54634ace07e-must-gather-output\") pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.401636 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.401548 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwbc\" (UniqueName: \"kubernetes.io/projected/a220a938-f1ec-4ad6-a50e-c54634ace07e-kube-api-access-rwwbc\") pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.401844 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.401825 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a220a938-f1ec-4ad6-a50e-c54634ace07e-must-gather-output\") pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.411058 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.411036 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwbc\" (UniqueName: \"kubernetes.io/projected/a220a938-f1ec-4ad6-a50e-c54634ace07e-kube-api-access-rwwbc\") 
pod \"must-gather-gknmd\" (UID: \"a220a938-f1ec-4ad6-a50e-c54634ace07e\") " pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.488759 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.488733 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/must-gather-gknmd" Apr 21 15:25:41.600937 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:41.600900 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/must-gather-gknmd"] Apr 21 15:25:41.604233 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:25:41.604202 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda220a938_f1ec_4ad6_a50e_c54634ace07e.slice/crio-2953d95eeebdf0da1f5e82bc9ae7d39e6be211f030da20ffaae3ab4135a4aaba WatchSource:0}: Error finding container 2953d95eeebdf0da1f5e82bc9ae7d39e6be211f030da20ffaae3ab4135a4aaba: Status 404 returned error can't find the container with id 2953d95eeebdf0da1f5e82bc9ae7d39e6be211f030da20ffaae3ab4135a4aaba Apr 21 15:25:42.256755 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:42.256719 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/must-gather-gknmd" event={"ID":"a220a938-f1ec-4ad6-a50e-c54634ace07e","Type":"ContainerStarted","Data":"2953d95eeebdf0da1f5e82bc9ae7d39e6be211f030da20ffaae3ab4135a4aaba"} Apr 21 15:25:43.263182 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.263145 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/must-gather-gknmd" event={"ID":"a220a938-f1ec-4ad6-a50e-c54634ace07e","Type":"ContainerStarted","Data":"52927d548a23e1a863893940fb3299e2541b10ac69c6a3539424a902d12c4a8b"} Apr 21 15:25:43.263182 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.263192 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/must-gather-gknmd" 
event={"ID":"a220a938-f1ec-4ad6-a50e-c54634ace07e","Type":"ContainerStarted","Data":"0e31fbd6383988a7625f38cf016dc75b4c5255084a21c2ccc45735dbf6447f96"} Apr 21 15:25:43.297503 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.297455 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2xz6/must-gather-gknmd" podStartSLOduration=1.660939113 podStartE2EDuration="2.297441656s" podCreationTimestamp="2026-04-21 15:25:41 +0000 UTC" firstStartedPulling="2026-04-21 15:25:41.60591942 +0000 UTC m=+907.485691211" lastFinishedPulling="2026-04-21 15:25:42.242421959 +0000 UTC m=+908.122193754" observedRunningTime="2026-04-21 15:25:43.297339041 +0000 UTC m=+909.177110855" watchObservedRunningTime="2026-04-21 15:25:43.297441656 +0000 UTC m=+909.177213468" Apr 21 15:25:43.691719 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.691647 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-b8vgq_c7fed088-3fe1-4c99-b3f4-37af1f9f317c/global-pull-secret-syncer/0.log" Apr 21 15:25:43.833866 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.833831 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4zr9g_9006254c-0f60-4649-85ee-dee1f0871d1b/konnectivity-agent/0.log" Apr 21 15:25:43.972188 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:43.972163 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-120.ec2.internal_e86a3d6d8eccc7f6cc8d288787df4758/haproxy/0.log" Apr 21 15:25:45.999193 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:45.997523 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5v6kh/must-gather-2wc9p"] Apr 21 15:25:45.999193 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:45.997854 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" 
podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="copy" containerID="cri-o://00163bebef1b3f9340330e0b8cd0e6023519471fab9821b5f94fc580c1538282" gracePeriod=2 Apr 21 15:25:46.002251 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.002201 2583 status_manager.go:895] "Failed to get status for pod" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" err="pods \"must-gather-2wc9p\" is forbidden: User \"system:node:ip-10-0-143-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5v6kh\": no relationship found between node 'ip-10-0-143-120.ec2.internal' and this object" Apr 21 15:25:46.002849 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.002800 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5v6kh/must-gather-2wc9p"] Apr 21 15:25:46.280753 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.280562 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5v6kh_must-gather-2wc9p_f5be69dd-0d09-45cb-82d2-7dc3add1d4c3/copy/0.log" Apr 21 15:25:46.280991 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.280898 2583 generic.go:358] "Generic (PLEG): container finished" podID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerID="00163bebef1b3f9340330e0b8cd0e6023519471fab9821b5f94fc580c1538282" exitCode=143 Apr 21 15:25:46.396795 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.385190 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5v6kh_must-gather-2wc9p_f5be69dd-0d09-45cb-82d2-7dc3add1d4c3/copy/0.log" Apr 21 15:25:46.396795 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.385658 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" Apr 21 15:25:46.396795 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.395944 2583 status_manager.go:895] "Failed to get status for pod" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" err="pods \"must-gather-2wc9p\" is forbidden: User \"system:node:ip-10-0-143-120.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5v6kh\": no relationship found between node 'ip-10-0-143-120.ec2.internal' and this object" Apr 21 15:25:46.457022 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.456980 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output\") pod \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " Apr 21 15:25:46.457284 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.457260 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh9t4\" (UniqueName: \"kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4\") pod \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\" (UID: \"f5be69dd-0d09-45cb-82d2-7dc3add1d4c3\") " Apr 21 15:25:46.460662 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.460463 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4" (OuterVolumeSpecName: "kube-api-access-sh9t4") pod "f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" (UID: "f5be69dd-0d09-45cb-82d2-7dc3add1d4c3"). InnerVolumeSpecName "kube-api-access-sh9t4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:25:46.463313 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.462961 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" (UID: "f5be69dd-0d09-45cb-82d2-7dc3add1d4c3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:25:46.559605 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.559497 2583 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-must-gather-output\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\"" Apr 21 15:25:46.559605 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.559535 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sh9t4\" (UniqueName: \"kubernetes.io/projected/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3-kube-api-access-sh9t4\") on node \"ip-10-0-143-120.ec2.internal\" DevicePath \"\"" Apr 21 15:25:46.750054 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:46.750014 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" path="/var/lib/kubelet/pods/f5be69dd-0d09-45cb-82d2-7dc3add1d4c3/volumes" Apr 21 15:25:47.288038 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.287959 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5v6kh_must-gather-2wc9p_f5be69dd-0d09-45cb-82d2-7dc3add1d4c3/copy/0.log" Apr 21 15:25:47.288551 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.288439 2583 scope.go:117] "RemoveContainer" containerID="00163bebef1b3f9340330e0b8cd0e6023519471fab9821b5f94fc580c1538282" Apr 21 15:25:47.288606 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.288567 2583 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-5v6kh/must-gather-2wc9p" Apr 21 15:25:47.298850 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.298802 2583 scope.go:117] "RemoveContainer" containerID="49c2b1188f92377f98c2307afdeabb0adf367cd469e92dd981fb149fa82d7566" Apr 21 15:25:47.471974 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.471894 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v42pl_68fd2d26-90ba-4d6a-af55-9bbf50606db8/node-exporter/0.log" Apr 21 15:25:47.501833 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.501805 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v42pl_68fd2d26-90ba-4d6a-af55-9bbf50606db8/kube-rbac-proxy/0.log" Apr 21 15:25:47.526885 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.526853 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v42pl_68fd2d26-90ba-4d6a-af55-9bbf50606db8/init-textfile/0.log" Apr 21 15:25:47.638755 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.638679 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c98jg_b0e43122-ccd1-4865-8281-214200591357/kube-rbac-proxy-main/0.log" Apr 21 15:25:47.668114 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.668085 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c98jg_b0e43122-ccd1-4865-8281-214200591357/kube-rbac-proxy-self/0.log" Apr 21 15:25:47.698864 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:47.698833 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c98jg_b0e43122-ccd1-4865-8281-214200591357/openshift-state-metrics/0.log" Apr 21 15:25:50.100962 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.100923 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"] Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101267 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="gather" Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101285 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="gather" Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101303 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="copy" Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101312 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="copy" Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101382 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="copy" Apr 21 15:25:50.101554 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.101395 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5be69dd-0d09-45cb-82d2-7dc3add1d4c3" containerName="gather" Apr 21 15:25:50.105303 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.105282 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.124826 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.124798 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"] Apr 21 15:25:50.187753 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.187723 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-podres\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.187936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.187787 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-sys\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.187936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.187837 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b675p\" (UniqueName: \"kubernetes.io/projected/3b6caf3f-9eea-4715-b33b-120506a42163-kube-api-access-b675p\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.187936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.187870 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-lib-modules\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: 
\"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.187936 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.187923 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-proc\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288727 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-podres\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-sys\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288757 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b675p\" (UniqueName: \"kubernetes.io/projected/3b6caf3f-9eea-4715-b33b-120506a42163-kube-api-access-b675p\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288802 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-lib-modules\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288855 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-proc\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288873 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-sys\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-lib-modules\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-podres\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" Apr 21 15:25:50.288945 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.288940 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b6caf3f-9eea-4715-b33b-120506a42163-proc\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"
Apr 21 15:25:50.297670 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.297638 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b675p\" (UniqueName: \"kubernetes.io/projected/3b6caf3f-9eea-4715-b33b-120506a42163-kube-api-access-b675p\") pod \"perf-node-gather-daemonset-jj2tt\" (UID: \"3b6caf3f-9eea-4715-b33b-120506a42163\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"
Apr 21 15:25:50.415431 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.415000 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"
Apr 21 15:25:50.583231 ip-10-0-143-120 kubenswrapper[2583]: W0421 15:25:50.583197 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b6caf3f_9eea_4715_b33b_120506a42163.slice/crio-24012debbd3eb9dfa358f9abaacf4945cd259959c6d1f9c8d75ede27cab1b663 WatchSource:0}: Error finding container 24012debbd3eb9dfa358f9abaacf4945cd259959c6d1f9c8d75ede27cab1b663: Status 404 returned error can't find the container with id 24012debbd3eb9dfa358f9abaacf4945cd259959c6d1f9c8d75ede27cab1b663
Apr 21 15:25:50.589312 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:50.589254 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"]
Apr 21 15:25:51.305155 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.305124 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" event={"ID":"3b6caf3f-9eea-4715-b33b-120506a42163","Type":"ContainerStarted","Data":"fc0c36f6bc016695b612b24cb3728768ed9ffa267f599a5aa7f496a8d43bed2e"}
Apr 21 15:25:51.305596 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.305162 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" event={"ID":"3b6caf3f-9eea-4715-b33b-120506a42163","Type":"ContainerStarted","Data":"24012debbd3eb9dfa358f9abaacf4945cd259959c6d1f9c8d75ede27cab1b663"}
Apr 21 15:25:51.305596 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.305313 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"
Apr 21 15:25:51.325475 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.325433 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt" podStartSLOduration=1.325419517 podStartE2EDuration="1.325419517s" podCreationTimestamp="2026-04-21 15:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:25:51.325013327 +0000 UTC m=+917.204785143" watchObservedRunningTime="2026-04-21 15:25:51.325419517 +0000 UTC m=+917.205191332"
Apr 21 15:25:51.426290 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.426264 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pdvb8_6626260d-dd76-4a21-87fd-b126ca3a6aac/dns/0.log"
Apr 21 15:25:51.448103 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.448077 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pdvb8_6626260d-dd76-4a21-87fd-b126ca3a6aac/kube-rbac-proxy/0.log"
Apr 21 15:25:51.470383 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.470354 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g64wd_572e9b4f-7cd2-40c1-9fe5-538bad3971a7/dns-node-resolver/0.log"
Apr 21 15:25:51.890870 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.890834 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7968f9c975-jjfvf_53446b04-0fba-43df-ac3d-640ef2ec654d/registry/0.log"
Apr 21 15:25:51.956041 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:51.956014 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bhzt8_f83ccf86-3867-4183-a1d2-d8fb6871e584/node-ca/0.log"
Apr 21 15:25:53.013599 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:53.013570 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rczsl_db283e03-ba54-420d-b4b8-709634355db8/serve-healthcheck-canary/0.log"
Apr 21 15:25:53.480536 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:53.480494 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8jpdv_c18cbd9b-856a-4074-9e0f-171624debb4f/kube-rbac-proxy/0.log"
Apr 21 15:25:53.504455 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:53.504434 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8jpdv_c18cbd9b-856a-4074-9e0f-171624debb4f/exporter/0.log"
Apr 21 15:25:53.529853 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:53.529829 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8jpdv_c18cbd9b-856a-4074-9e0f-171624debb4f/extractor/0.log"
Apr 21 15:25:55.291737 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:55.291713 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-fkq8p_4808c1f2-e2ab-4d58-a329-b1d4007917a4/jobset-operator/0.log"
Apr 21 15:25:57.320932 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:57.320074 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-jj2tt"
Apr 21 15:25:59.907721 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:59.907652 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:25:59.934520 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:59.934492 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/egress-router-binary-copy/0.log"
Apr 21 15:25:59.963445 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:25:59.963422 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/cni-plugins/0.log"
Apr 21 15:26:00.006547 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.006496 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/bond-cni-plugin/0.log"
Apr 21 15:26:00.043390 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.043365 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/routeoverride-cni/0.log"
Apr 21 15:26:00.075983 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.075956 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/whereabouts-cni-bincopy/0.log"
Apr 21 15:26:00.103860 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.103833 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2z5r_2a989bd5-4d7c-4917-b441-576b61407d76/whereabouts-cni/0.log"
Apr 21 15:26:00.542212 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.542154 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j58rf_ecffb9c8-2d5e-409b-8013-126edd86ac8a/kube-multus/0.log"
Apr 21 15:26:00.755333 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.755260 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wpmpv_3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad/network-metrics-daemon/0.log"
Apr 21 15:26:00.783298 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:00.783271 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wpmpv_3cc02d04-6c8a-4b9f-a09b-5bdf11c8e8ad/kube-rbac-proxy/0.log"
Apr 21 15:26:01.603293 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.603266 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/ovn-controller/0.log"
Apr 21 15:26:01.626217 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.626189 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/ovn-acl-logging/0.log"
Apr 21 15:26:01.651785 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.651751 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/kube-rbac-proxy-node/0.log"
Apr 21 15:26:01.672289 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.672263 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 15:26:01.689481 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.689449 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/northd/0.log"
Apr 21 15:26:01.710288 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.710267 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/nbdb/0.log"
Apr 21 15:26:01.734797 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.734765 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/sbdb/0.log"
Apr 21 15:26:01.864941 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:01.864866 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bf2p7_e46ae269-acf9-41f8-bfa2-1d7fd1b27c47/ovnkube-controller/0.log"
Apr 21 15:26:03.617001 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:03.616974 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bhk2h_a3c8ffdc-c1d9-4058-af4c-7ed9bd46d1ba/network-check-target-container/0.log"
Apr 21 15:26:04.537786 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:04.537742 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7n5n5_9b3e45df-578e-4456-b850-310c9d4a72fa/iptables-alerter/0.log"
Apr 21 15:26:05.370946 ip-10-0-143-120 kubenswrapper[2583]: I0421 15:26:05.370911 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-68mrx_93dbff63-b3ad-4508-8f23-3d4394458b3b/tuned/0.log"