Apr 23 17:58:38.406947 ip-10-0-128-229 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:58:38.883139 ip-10-0-128-229 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:38.883139 ip-10-0-128-229 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:58:38.883139 ip-10-0-128-229 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:38.883139 ip-10-0-128-229 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:58:38.883139 ip-10-0-128-229 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:38.887323 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.887234 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893232 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893253 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893257 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893260 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893263 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893266 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893269 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:38.893261 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893271 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893275 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893277 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893280 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893283 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893286 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893289 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893291 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893296 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893300 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893302 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893305 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893307 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893310 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893312 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893315 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893317 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893320 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893322 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:38.893556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893325 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893330 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893333 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893336 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893338 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893340 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893343 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893345 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893348 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893350 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893353 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893356 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893359 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893365 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893369 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893373 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893376 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893379 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893381 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893384 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893386 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:38.894004 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893389 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893392 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893394 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893397 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893400 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893402 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893405 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893408 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893411 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893413 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893418 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893422 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893425 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893427 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893430 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893432 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893435 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893438 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893441 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893444 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:38.894578 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893447 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893449 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893452 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893454 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893456 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893459 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893463 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893466 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893469 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893471 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893474 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893478 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893480 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893483 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893485 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893488 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893492 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893496 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893499 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:38.895067 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893930 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893936 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893939 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893942 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893945 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893948 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893951 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893954 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893957 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893960 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893963 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893965 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893968 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893971 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893973 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893976 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893979 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893981 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893984 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893987 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:38.895558 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893990 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893993 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893995 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.893998 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894001 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894004 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894006 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894009 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894012 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894015 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894018 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894020 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894023 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894027 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894030 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894033 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894035 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894038 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894040 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894042 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:38.896037 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894045 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894048 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894050 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894052 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894055 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894057 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894061 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894065 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894068 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894071 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894073 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894076 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894079 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894081 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894084 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894086 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894089 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894093 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894095 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:38.896555 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894097 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894100 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894102 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894105 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894107 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894110 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894112 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894115 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894118 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894120 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894123 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894125 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894128 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894130 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894132 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894135 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894137 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894141 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894143 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894146 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:38.897050 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894149 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894151 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894154 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894156 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894159 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894161 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894163 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894248 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894264 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894271 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894275 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894280 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894284 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894288 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894293 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894296 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894299 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894303 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894306 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894309 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894312 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894315 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894318 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:58:38.897556 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894321 2577 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894324 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894326 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894331 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894333 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894337 2577 flags.go:64] FLAG: --config-dir=""
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894340 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894344 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894348 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894351 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894354 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894357 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894361 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894363 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894366 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894370 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894372 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894377 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894380 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894382 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894385 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894391 2577 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894394 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894399 2577 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894402 2577 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:58:38.898113 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894405 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894408 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894411 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894415 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894418 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894421 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894424 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894427 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894430 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894433 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894436 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894439 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894442 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894445 2577 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894449
2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894452 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894454 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894458 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894461 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894464 2577 flags.go:64] FLAG: --help="false" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894466 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-128-229.ec2.internal" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894469 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894472 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:58:38.898729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894475 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894478 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894481 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894484 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894487 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:58:38.899281 
ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894490 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894493 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894496 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894499 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894502 2577 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894504 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894507 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894513 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894515 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894518 2577 flags.go:64] FLAG: --lock-file="" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894521 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894524 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894540 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894546 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894551 2577 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894554 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894557 2577 flags.go:64] FLAG: --logging-format="text" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894560 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894563 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:58:38.899281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894566 2577 flags.go:64] FLAG: --manifest-url="" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894569 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894573 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894576 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894580 2577 flags.go:64] FLAG: --max-pods="110" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894583 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894586 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894589 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894592 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894595 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:58:38.899885 
ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894598 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894601 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894609 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894612 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894615 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894619 2577 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894622 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894627 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894630 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894633 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894637 2577 flags.go:64] FLAG: --port="10250" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894640 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894643 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06c80b2605996e069" Apr 23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894647 2577 flags.go:64] FLAG: --qos-reserved="" Apr 
23 17:58:38.899885 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894650 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894653 2577 flags.go:64] FLAG: --register-node="true" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894656 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894658 2577 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894664 2577 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894666 2577 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894669 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894672 2577 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894676 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894679 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894690 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894694 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894696 2577 flags.go:64] FLAG: --runonce="false" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894699 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894702 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894705 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894708 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894711 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894714 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894717 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894720 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894722 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894725 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894728 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894731 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894735 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:58:38.900457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894737 2577 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894740 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894747 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:58:38.901103 ip-10-0-128-229 
kubenswrapper[2577]: I0423 17:58:38.894750 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894753 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894757 2577 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894760 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894762 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894765 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894768 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894772 2577 flags.go:64] FLAG: --v="2" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894776 2577 flags.go:64] FLAG: --version="false" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894780 2577 flags.go:64] FLAG: --vmodule="" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894785 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.894788 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894885 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894888 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894891 2577 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894894 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894896 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894899 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894902 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894905 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:38.901103 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894907 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894909 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894912 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894914 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894917 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894919 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894922 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:58:38.901787 ip-10-0-128-229 
kubenswrapper[2577]: W0423 17:58:38.894924 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894927 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894930 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894932 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894936 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894938 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894941 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894943 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894946 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894948 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894951 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894953 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894958 2577 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 23 17:58:38.901787 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894961 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894964 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894966 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894969 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894971 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894974 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894976 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894980 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894984 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894986 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894989 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894992 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894995 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894997 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.894999 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895002 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895004 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895007 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895009 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:38.902734 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895012 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895014 2577 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895017 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895020 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895024 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895026 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895028 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895031 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895033 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895036 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895038 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895041 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895046 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895050 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895053 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895056 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895059 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895061 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895064 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:58:38.903582 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895067 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895070 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895072 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895075 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895078 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895081 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: 
W0423 17:58:38.895083 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895086 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895088 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895091 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895093 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895096 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895099 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895101 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895103 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895106 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895109 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895116 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:38.904113 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.895119 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:38.904113 ip-10-0-128-229 
kubenswrapper[2577]: W0423 17:58:38.895121 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:38.904581 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.896002 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:38.904713 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.904688 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 17:58:38.904771 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.904717 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 17:58:38.904818 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904798 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:38.904818 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904808 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:38.904818 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904813 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:38.904818 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904818 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904824 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904830 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 
17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904835 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904840 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904844 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904851 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904857 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904862 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904868 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904875 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904879 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904884 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904889 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904893 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904898 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904902 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904906 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:38.904997 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904910 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904914 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904919 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904922 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 
17:58:38.904928 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904932 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904936 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904940 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904944 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904948 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904953 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904957 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904961 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904965 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904969 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904974 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904979 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:38.905813 ip-10-0-128-229 
kubenswrapper[2577]: W0423 17:58:38.904983 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904987 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904991 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:38.905813 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.904995 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905000 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905004 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905008 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905012 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905016 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905021 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905025 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905029 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905033 2577 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905037 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905041 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905045 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905049 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905053 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905057 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905061 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905067 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905071 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:38.906364 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905075 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905079 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905083 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 
17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905087 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905091 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905095 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905099 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905103 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905108 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905112 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905117 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905121 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905125 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905129 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905134 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905138 2577 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905142 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905146 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905150 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905154 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:38.906957 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905158 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905162 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905168 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905172 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905176 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905180 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.905188 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905391 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905401 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905408 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905415 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905423 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905428 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905432 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905437 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:38.907566 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905442 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905446 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905451 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 
17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905456 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905460 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905465 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905469 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905474 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905478 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905483 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905487 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905492 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905496 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905500 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905504 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905508 2577 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905512 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905516 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905520 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905524 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:38.908235 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905544 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905548 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905552 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905555 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905560 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905564 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905568 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905572 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905577 2577 feature_gate.go:328] unrecognized 
feature gate: VSphereMixedNodeEnv Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905582 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905588 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905593 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905597 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905601 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905605 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905609 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905613 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905618 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905623 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905627 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:38.908749 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905631 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 
17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905635 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905639 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905643 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905647 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905651 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905655 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905660 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905664 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905669 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905673 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905677 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905681 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 
17:58:38.905685 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905689 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905693 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905697 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905701 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905705 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:38.909385 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905709 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905713 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905717 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905722 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905726 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905730 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905734 2577 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905738 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905742 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905746 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905750 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905755 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905759 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905764 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905768 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905772 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905775 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905779 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:38.910016 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:38.905784 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:58:38.910504 ip-10-0-128-229 
kubenswrapper[2577]: I0423 17:58:38.905792 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:38.910504 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.906696 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:58:38.911167 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.911151 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:58:38.912293 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.912280 2577 server.go:1019] "Starting client certificate rotation" Apr 23 17:58:38.912400 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.912381 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:58:38.912441 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.912427 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:58:38.939376 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.939352 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:58:38.943894 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.943869 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:58:38.963871 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.963850 2577 log.go:25] "Validated CRI v1 runtime API" Apr 23 
17:58:38.969945 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.969921 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:58:38.969945 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.969937 2577 log.go:25] "Validated CRI v1 image API" Apr 23 17:58:38.972177 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.972156 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:58:38.976552 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.976499 2577 fs.go:135] Filesystem UUIDs: map[5276e240-66e4-4e0b-bd3e-e1cce3d9f36c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 853dd110-3510-4d1e-82b5-d1cd8e554c03:/dev/nvme0n1p3] Apr 23 17:58:38.976633 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.976524 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 17:58:38.982719 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.982549 2577 manager.go:217] Machine: {Timestamp:2026-04-23 17:58:38.980452816 +0000 UTC m=+0.447047615 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101911 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec274dd94dbb843fe4195f01fcb57e7f SystemUUID:ec274dd9-4dbb-843f-e419-5f01fcb57e7f BootID:76a6dd79-51a1-4070-94bf-93179dc9252f 
Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2f:49:99:3f:a3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2f:49:99:3f:a3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:7d:df:a5:d6:d0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:58:38.982719 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.982705 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:58:38.982915 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.982895 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:58:38.984064 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984037 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:58:38.984245 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984066 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-229.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:58:38.984319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984260 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:58:38.984319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984274 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:58:38.984319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984293 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:58:38.984319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.984311 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:58:38.985627 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.985612 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:58:38.985938 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.985926 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:58:38.988655 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.988643 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:58:38.988729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.988665 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:58:38.988729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.988684 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:58:38.988729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.988698 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:58:38.988729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.988711 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 17:58:38.989870 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.989857 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:58:38.989941 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.989883 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:58:38.993565 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.993410 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 17:58:38.994832 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.994812 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7vtx4"
Apr 23 17:58:38.995774 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.995758 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 17:58:38.997349 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997328 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997371 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997379 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997386 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997392 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997399 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997405 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 17:58:38.997412 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997410 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 17:58:38.997618 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997418 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 17:58:38.997618 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997424 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 17:58:38.997618 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997432 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 17:58:38.997618 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.997442 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 17:58:38.998377 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.998368 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 17:58:38.998448 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:38.998380 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 17:58:38.998589 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:38.998570 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-229.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:58:38.998635 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:38.998573 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:58:39.002221 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.002207 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 17:58:39.002299 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.002247 2577 server.go:1295] "Started kubelet"
Apr 23 17:58:39.002388 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.002348 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 17:58:39.002426 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.002391 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 17:58:39.002456 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.002442 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 17:58:39.003236 ip-10-0-128-229 systemd[1]: Started Kubernetes Kubelet.
Apr 23 17:58:39.004114 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.004094 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:58:39.004522 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.004502 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7vtx4"
Apr 23 17:58:39.005450 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.005432 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:58:39.009736 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.009718 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:58:39.010180 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010166 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:58:39.010816 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010798 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:58:39.010816 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010805 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:58:39.010922 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010827 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:58:39.010955 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010928 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:58:39.010955 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.010936 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:58:39.011072 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.011058 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.013829 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.013631 2577 factory.go:55] Registering systemd factory
Apr 23 17:58:39.013829 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.013657 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:58:39.014010 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.013996 2577 factory.go:153] Registering CRI-O factory
Apr 23 17:58:39.014087 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.014014 2577 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:58:39.014087 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.014083 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:58:39.014183 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.014109 2577 factory.go:103] Registering Raw factory
Apr 23 17:58:39.014183 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.014125 2577 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:58:39.014743 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.014724 2577 manager.go:319] Starting recovery of all containers
Apr 23 17:58:39.014906 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.014883 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 17:58:39.016592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.015737 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:39.019283 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.019258 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-229.ec2.internal\" not found" node="ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.021401 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.021379 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-229.ec2.internal" not found
Apr 23 17:58:39.026850 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.026833 2577 manager.go:324] Recovery completed
Apr 23 17:58:39.030818 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.030804 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.033410 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.033394 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.033480 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.033432 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.033480 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.033442 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.033995 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.033983 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:58:39.033995 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.033993 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:58:39.034083 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.034008 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:58:39.036267 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.036254 2577 policy_none.go:49] "None policy: Start"
Apr 23 17:58:39.036267 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.036270 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:58:39.036362 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.036279 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:58:39.039985 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.039960 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-229.ec2.internal" not found
Apr 23 17:58:39.073951 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.073932 2577 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:58:39.074045 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.074003 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:58:39.074045 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074013 2577 server.go:85] "Starting device plugin registration server"
Apr 23 17:58:39.074214 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074204 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:58:39.074245 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074217 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:58:39.074750 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074505 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:58:39.074750 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074611 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:58:39.074750 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.074620 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:58:39.075833 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.075587 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:58:39.075833 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.075622 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.099960 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.099941 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-229.ec2.internal" not found
Apr 23 17:58:39.155702 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.155610 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:58:39.157074 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.157054 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:58:39.157074 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.157079 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:58:39.157275 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.157122 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 17:58:39.157275 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.157133 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:58:39.157275 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.157168 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:58:39.159000 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.158984 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:39.175235 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.175207 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.176140 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.176123 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.176227 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.176153 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.176227 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.176162 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.176227 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.176187 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.185287 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.185269 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.185353 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.185297 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-229.ec2.internal\": node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.201895 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.201850 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.258247 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.258209 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"]
Apr 23 17:58:39.258372 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.258308 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.259579 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.259564 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.259646 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.259594 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.259646 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.259604 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.261041 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261027 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.261197 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261183 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.261244 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261212 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.261920 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261903 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.262013 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261934 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.262013 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261904 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.262013 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261971 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.262013 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261947 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.262013 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.261996 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.263095 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.263082 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.263155 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.263116 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:58:39.263791 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.263775 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:58:39.263859 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.263802 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:58:39.263859 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.263812 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:58:39.277116 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.277093 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-229.ec2.internal\" not found" node="ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.281001 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.280979 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-229.ec2.internal\" not found" node="ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.302948 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.302928 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.312478 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.312456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43b465d4e8bfce095d9d53677dbda72a-config\") pod \"kube-apiserver-proxy-ip-10-0-128-229.ec2.internal\" (UID: \"43b465d4e8bfce095d9d53677dbda72a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.312589 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.312481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.312589 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.312506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.403404 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.403368 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.412792 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.412792 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.412937 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.412937 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43b465d4e8bfce095d9d53677dbda72a-config\") pod \"kube-apiserver-proxy-ip-10-0-128-229.ec2.internal\" (UID: \"43b465d4e8bfce095d9d53677dbda72a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.412937 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf64edd20dcf0bca603d270a7bdadff-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal\" (UID: \"edf64edd20dcf0bca603d270a7bdadff\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.412937 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.412851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43b465d4e8bfce095d9d53677dbda72a-config\") pod \"kube-apiserver-proxy-ip-10-0-128-229.ec2.internal\" (UID: \"43b465d4e8bfce095d9d53677dbda72a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.504163 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.504137 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.578635 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.578602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.583435 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.583409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:39.604275 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.604245 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.704790 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.704702 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.805147 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.805116 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.824391 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.824355 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:39.906152 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:39.906116 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:39.912407 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.912386 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:58:39.912613 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.912578 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:58:39.912679 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.912582 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:58:39.912679 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:39.912608 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:58:40.007050 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:40.007015 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:40.007050 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.007028 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:53:38 +0000 UTC" deadline="2028-01-15 04:10:04.879983474 +0000 UTC"
Apr 23 17:58:40.007249 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.007060 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15154h11m24.872927691s"
Apr 23 17:58:40.010213 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.010181 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:58:40.030196 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.030165 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:58:40.055330 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.055303 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hj7d4"
Apr 23 17:58:40.060575 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.060548 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hj7d4"
Apr 23 17:58:40.073901 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:40.073211 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf64edd20dcf0bca603d270a7bdadff.slice/crio-b4f97e68d66970eb2a9bb7517ad971b159f60744f8a0e4e3daa166bbec3ec01b WatchSource:0}: Error finding container b4f97e68d66970eb2a9bb7517ad971b159f60744f8a0e4e3daa166bbec3ec01b: Status 404 returned error can't find the container with id b4f97e68d66970eb2a9bb7517ad971b159f60744f8a0e4e3daa166bbec3ec01b
Apr 23 17:58:40.074489 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:40.074466 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b465d4e8bfce095d9d53677dbda72a.slice/crio-31edcc63fda63b41cf3d43140fc6414db3df773561b3a1e3e83628fdb7657093 WatchSource:0}: Error finding container 31edcc63fda63b41cf3d43140fc6414db3df773561b3a1e3e83628fdb7657093: Status 404 returned error can't find the container with id 31edcc63fda63b41cf3d43140fc6414db3df773561b3a1e3e83628fdb7657093
Apr 23 17:58:40.077959 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.077945 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:58:40.107801 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:40.107771 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-229.ec2.internal\" not found"
Apr 23 17:58:40.160100 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.160046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal" event={"ID":"43b465d4e8bfce095d9d53677dbda72a","Type":"ContainerStarted","Data":"31edcc63fda63b41cf3d43140fc6414db3df773561b3a1e3e83628fdb7657093"}
Apr 23 17:58:40.160934 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.160913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal" event={"ID":"edf64edd20dcf0bca603d270a7bdadff","Type":"ContainerStarted","Data":"b4f97e68d66970eb2a9bb7517ad971b159f60744f8a0e4e3daa166bbec3ec01b"}
Apr 23 17:58:40.201569 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.201545 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:40.210646 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.210625 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:40.221424 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.221365 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:58:40.222380 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.222368 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal"
Apr 23 17:58:40.232582 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.232560 2577 warnings.go:110]
"Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:58:40.818704 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.818664 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:40.990768 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.990726 2577 apiserver.go:52] "Watching apiserver" Apr 23 17:58:40.999443 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.999408 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:58:40.999822 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:40.999799 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xfxr2","kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6","openshift-multus/multus-7wxpl","openshift-multus/multus-additional-cni-plugins-p7mrf","openshift-network-diagnostics/network-check-target-8c8cd","kube-system/konnectivity-agent-bkbbs","openshift-cluster-node-tuning-operator/tuned-zb5m5","openshift-dns/node-resolver-799b7","openshift-image-registry/node-ca-9z6qm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal","openshift-multus/network-metrics-daemon-g6b5k","openshift-network-operator/iptables-alerter-nnpn7"] Apr 23 17:58:41.001295 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.001264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.002462 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.002439 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.003597 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.003579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.004228 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.004204 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:58:41.004338 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.004270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:58:41.004338 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.004283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j66km\"" Apr 23 17:58:41.005097 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.005079 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mm2md\"" Apr 23 17:58:41.005378 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.005358 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.005378 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.005374 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:58:41.006161 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.006142 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:58:41.006254 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.006223 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:58:41.006335 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.006321 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.006624 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.006582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 17:58:41.006790 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.006770 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.007057 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.007042 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mqqwn\"" Apr 23 17:58:41.007242 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.007226 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:58:41.008475 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.007571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.008475 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.007815 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.009672 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.009653 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.012336 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.012314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:58:41.012510 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.012494 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-75hp2\"" Apr 23 17:58:41.012767 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.012747 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:58:41.013030 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.013016 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.013105 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.013081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:58:41.013344 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.013331 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.013417 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.013402 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.013658 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.013638 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:58:41.014168 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.014149 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-przhk\"" Apr 23 17:58:41.014280 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.014261 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.015310 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.015290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.017361 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.016860 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5gfcv\"" Apr 23 17:58:41.017361 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.016904 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.017361 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.016972 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.017514 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.017399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.017514 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.017409 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.017694 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.017656 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:58:41.017868 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.017849 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:58:41.017961 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.017885 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:58:41.018027 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.018006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wpjkn\"" Apr 23 17:58:41.018280 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.018264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.018402 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.018385 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.018630 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.018383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:58:41.019178 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019157 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:58:41.019514 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019499 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:58:41.019617 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019518 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2wb4q\"" Apr 23 17:58:41.019685 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-socket-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.019685 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-cnibin\") pod \"multus-7wxpl\" (UID: 
\"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.019794 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-netns\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.019794 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-run\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.019794 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-system-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.019794 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-kubelet\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-ovn\") 
pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqrn\" (UniqueName: \"kubernetes.io/projected/23fb900a-d97c-474d-b2b4-649024dc77d9-kube-api-access-mdqrn\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-env-overrides\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-script-lib\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.019989 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-host\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.020035 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-k8s-cni-cncf-io\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-system-cni-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-bin\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-kubernetes\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-conf-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-daemon-config\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-multus-certs\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovn-node-metrics-cert\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscl8\" (UniqueName: \"kubernetes.io/projected/554e663c-f1ef-46e3-bfb8-dc66a756354c-kube-api-access-tscl8\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 
17:58:41.020337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2791436a-c956-4bbf-81a8-9cf1dff161c4-agent-certs\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-var-lib-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.020365 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-config\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-systemd\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-var-lib-kubelet\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-os-release\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-slash\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-node-log\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-hostroot\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020547 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-tmp\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-registration-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrmws\" (UniqueName: \"kubernetes.io/projected/9eb4f421-178c-437b-9788-b5d35c25462a-kube-api-access-qrmws\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-bin\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-sys\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-etc-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-kubelet\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-systemd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-cnibin\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-sys-fs\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020849 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-netns\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020866 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zljhc\""
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmvtg\" (UniqueName: \"kubernetes.io/projected/a5556727-cb66-4227-ad85-c113c4a3cd70-kube-api-access-gmvtg\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-netd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-tmp-dir\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-etc-tuned\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.020990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2791436a-c956-4bbf-81a8-9cf1dff161c4-konnectivity-ca\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021011 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-cni-binary-copy\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-log-socket\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-etc-kubernetes\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.021867 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-hosts-file\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8vj\" (UniqueName: \"kubernetes.io/projected/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-kube-api-access-ml8vj\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysconfig\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-conf\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-etc-selinux\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-multus\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-modprobe-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6d7\" (UniqueName: \"kubernetes.io/projected/22139571-cd15-4427-b88f-162e4339848d-kube-api-access-8p6d7\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-device-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-socket-dir-parent\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-systemd-units\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-os-release\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.022592 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.021426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-lib-modules\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.061229 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.061202 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:40 +0000 UTC" deadline="2027-12-11 06:56:44.765846277 +0000 UTC"
Apr 23 17:58:41.061229 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.061225 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14316h58m3.704622914s"
Apr 23 17:58:41.112043 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.111954 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 17:58:41.122663 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-modprobe-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.122663 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6d7\" (UniqueName: \"kubernetes.io/projected/22139571-cd15-4427-b88f-162e4339848d-kube-api-access-8p6d7\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.122912 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-device-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.122912 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-socket-dir-parent\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.122912 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-modprobe-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.122912 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-systemd-units\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.122912 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8q6\" (UniqueName: \"kubernetes.io/projected/f0d9878b-7280-4232-a0c9-247ed15ce7a8-kube-api-access-jd8q6\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7"
Apr 23 17:58:41.123138 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-os-release\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.123138 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-lib-modules\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.123138 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-socket-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.123138 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-cnibin\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123138 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-netns\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-serviceca\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-socket-dir-parent\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.122781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-device-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-run\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-system-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-systemd-units\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-kubelet\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-ovn\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-run\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-socket-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqh5z\" (UniqueName: \"kubernetes.io/projected/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-kube-api-access-cqh5z\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.123370 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-cnibin\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-os-release\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqrn\" (UniqueName: \"kubernetes.io/projected/23fb900a-d97c-474d-b2b4-649024dc77d9-kube-api-access-mdqrn\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-netns\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-kubelet\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-lib-modules\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-env-overrides\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-system-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-script-lib\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-ovn\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-host\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-k8s-cni-cncf-io\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-system-cni-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0d9878b-7280-4232-a0c9-247ed15ce7a8-host-slash\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-k8s-cni-cncf-io\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.123996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.123981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-system-cni-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-bin\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-kubernetes\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-env-overrides\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-kubernetes\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-conf-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-script-lib\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-host\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-bin\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-daemon-config\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl"
Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-conf-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") "
pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-multus-certs\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-multus-certs\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.124688 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovn-node-metrics-cert\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tscl8\" (UniqueName: \"kubernetes.io/projected/554e663c-f1ef-46e3-bfb8-dc66a756354c-kube-api-access-tscl8\") pod \"ovnkube-node-xfxr2\" (UID: 
\"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgjp2\" (UniqueName: \"kubernetes.io/projected/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-kube-api-access-kgjp2\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2791436a-c956-4bbf-81a8-9cf1dff161c4-agent-certs\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-var-lib-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-var-lib-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-config\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-systemd\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-var-lib-kubelet\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-os-release\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-slash\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-node-log\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-d\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.125329 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-systemd\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124836 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-os-release\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-node-log\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-hostroot\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-slash\") pod \"ovnkube-node-xfxr2\" (UID: 
\"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-var-lib-kubelet\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-hostroot\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.124971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-tmp\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-registration-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: 
\"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-registration-dir\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-daemon-config\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrmws\" (UniqueName: \"kubernetes.io/projected/9eb4f421-178c-437b-9788-b5d35c25462a-kube-api-access-qrmws\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-bin\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.126258 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.125990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-sys\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-etc-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-host\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-bin\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-sys\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-etc-openvswitch\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovnkube-config\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-kubelet\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-systemd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-kubelet\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126933 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f0d9878b-7280-4232-a0c9-247ed15ce7a8-iptables-alerter-script\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.126991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-run-systemd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127002 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-cnibin\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-sys-fs\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.127079 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23fb900a-d97c-474d-b2b4-649024dc77d9-cnibin\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-netns\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmvtg\" (UniqueName: \"kubernetes.io/projected/a5556727-cb66-4227-ad85-c113c4a3cd70-kube-api-access-gmvtg\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127143 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-run-netns\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-sys-fs\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-netd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-host-cni-netd\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.131011 ip-10-0-128-229 
kubenswrapper[2577]: I0423 17:58:41.127325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-tmp-dir\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-etc-tuned\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2791436a-c956-4bbf-81a8-9cf1dff161c4-konnectivity-ca\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-cni-binary-copy\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-log-socket\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127562 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-etc-kubernetes\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-hosts-file\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8vj\" (UniqueName: \"kubernetes.io/projected/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-kube-api-access-ml8vj\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysconfig\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-tmp-dir\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.131011 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127714 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-conf\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-etc-kubernetes\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.127914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23fb900a-d97c-474d-b2b4-649024dc77d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-hosts-file\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554e663c-f1ef-46e3-bfb8-dc66a756354c-log-socket\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128148 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysconfig\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5556727-cb66-4227-ad85-c113c4a3cd70-cni-binary-copy\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-etc-selinux\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554e663c-f1ef-46e3-bfb8-dc66a756354c-ovn-node-metrics-cert\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eb4f421-178c-437b-9788-b5d35c25462a-etc-selinux\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128844 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2791436a-c956-4bbf-81a8-9cf1dff161c4-agent-certs\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-multus\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.128968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2791436a-c956-4bbf-81a8-9cf1dff161c4-konnectivity-ca\") pod \"konnectivity-agent-bkbbs\" (UID: \"2791436a-c956-4bbf-81a8-9cf1dff161c4\") " pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.129007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-host-var-lib-cni-multus\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.129059 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5556727-cb66-4227-ad85-c113c4a3cd70-multus-cni-dir\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.129166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22139571-cd15-4427-b88f-162e4339848d-etc-sysctl-conf\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.129897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-tmp\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.131651 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.130810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22139571-cd15-4427-b88f-162e4339848d-etc-tuned\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.134353 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.134043 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:41.134353 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.134067 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:41.134353 ip-10-0-128-229 kubenswrapper[2577]: E0423 
17:58:41.134079 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:41.134353 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.134143 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:58:41.634122685 +0000 UTC m=+3.100717473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:41.134914 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.134888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqrn\" (UniqueName: \"kubernetes.io/projected/23fb900a-d97c-474d-b2b4-649024dc77d9-kube-api-access-mdqrn\") pod \"multus-additional-cni-plugins-p7mrf\" (UID: \"23fb900a-d97c-474d-b2b4-649024dc77d9\") " pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.135039 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.135018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6d7\" (UniqueName: \"kubernetes.io/projected/22139571-cd15-4427-b88f-162e4339848d-kube-api-access-8p6d7\") pod \"tuned-zb5m5\" (UID: \"22139571-cd15-4427-b88f-162e4339848d\") " pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 
17:58:41.135818 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.135794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrmws\" (UniqueName: \"kubernetes.io/projected/9eb4f421-178c-437b-9788-b5d35c25462a-kube-api-access-qrmws\") pod \"aws-ebs-csi-driver-node-bksl6\" (UID: \"9eb4f421-178c-437b-9788-b5d35c25462a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.136559 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.136480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8vj\" (UniqueName: \"kubernetes.io/projected/39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20-kube-api-access-ml8vj\") pod \"node-resolver-799b7\" (UID: \"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20\") " pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.136913 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.136892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmvtg\" (UniqueName: \"kubernetes.io/projected/a5556727-cb66-4227-ad85-c113c4a3cd70-kube-api-access-gmvtg\") pod \"multus-7wxpl\" (UID: \"a5556727-cb66-4227-ad85-c113c4a3cd70\") " pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.137173 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.137153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscl8\" (UniqueName: \"kubernetes.io/projected/554e663c-f1ef-46e3-bfb8-dc66a756354c-kube-api-access-tscl8\") pod \"ovnkube-node-xfxr2\" (UID: \"554e663c-f1ef-46e3-bfb8-dc66a756354c\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.230319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.229887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgjp2\" (UniqueName: \"kubernetes.io/projected/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-kube-api-access-kgjp2\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " 
pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.230319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.230319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-host\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.230319 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-host\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.230324 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f0d9878b-7280-4232-a0c9-247ed15ce7a8-iptables-alerter-script\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.230394 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:58:41.730373358 +0000 UTC m=+3.196968160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8q6\" (UniqueName: \"kubernetes.io/projected/f0d9878b-7280-4232-a0c9-247ed15ce7a8-kube-api-access-jd8q6\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-serviceca\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqh5z\" (UniqueName: \"kubernetes.io/projected/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-kube-api-access-cqh5z\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f0d9878b-7280-4232-a0c9-247ed15ce7a8-host-slash\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.230669 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0d9878b-7280-4232-a0c9-247ed15ce7a8-host-slash\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.231068 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.230905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f0d9878b-7280-4232-a0c9-247ed15ce7a8-iptables-alerter-script\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.231068 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.231019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-serviceca\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.240284 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.240253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8q6\" (UniqueName: \"kubernetes.io/projected/f0d9878b-7280-4232-a0c9-247ed15ce7a8-kube-api-access-jd8q6\") pod \"iptables-alerter-nnpn7\" (UID: \"f0d9878b-7280-4232-a0c9-247ed15ce7a8\") " pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.240446 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.240427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kgjp2\" (UniqueName: \"kubernetes.io/projected/6296d6a8-88ab-4cde-8f0b-ca707b8e5b51-kube-api-access-kgjp2\") pod \"node-ca-9z6qm\" (UID: \"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51\") " pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.240508 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.240491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqh5z\" (UniqueName: \"kubernetes.io/projected/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-kube-api-access-cqh5z\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.313205 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.313171 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:58:41.320960 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.320928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" Apr 23 17:58:41.330060 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.330035 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-799b7" Apr 23 17:58:41.335729 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.335706 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7wxpl" Apr 23 17:58:41.341382 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.341356 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:58:41.346185 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.346160 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" Apr 23 17:58:41.352876 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.352852 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9z6qm" Apr 23 17:58:41.361120 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.361096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" Apr 23 17:58:41.364793 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.364740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nnpn7" Apr 23 17:58:41.489271 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.489235 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:41.627737 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.627673 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:41.634829 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.634802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:58:41.634984 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.634967 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:41.635025 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.634988 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:41.635025 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.634998 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:41.635085 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.635047 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:58:42.635032606 +0000 UTC m=+4.101627392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:41.696376 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.696337 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22139571_cd15_4427_b88f_162e4339848d.slice/crio-d3a249e63927fb402291db1626298cf500627041b76ac6c23f77eb04b914b5aa WatchSource:0}: Error finding container d3a249e63927fb402291db1626298cf500627041b76ac6c23f77eb04b914b5aa: Status 404 returned error can't find the container with id d3a249e63927fb402291db1626298cf500627041b76ac6c23f77eb04b914b5aa Apr 23 17:58:41.698082 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.698064 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b5a3bb_ca15_4b0b_90d9_fb7ca1985f20.slice/crio-e01683e940345c78f607c246fb41fd5b457cdf743f77c0bc806c020e01944e1d WatchSource:0}: Error finding container e01683e940345c78f607c246fb41fd5b457cdf743f77c0bc806c020e01944e1d: Status 404 returned error can't find the container with id e01683e940345c78f607c246fb41fd5b457cdf743f77c0bc806c020e01944e1d Apr 23 17:58:41.699242 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.699214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2791436a_c956_4bbf_81a8_9cf1dff161c4.slice/crio-f4d0c484df6fc5ccbd26b908f2409872ff33608d0c9cd292422fde91065cdc49 WatchSource:0}: Error finding container f4d0c484df6fc5ccbd26b908f2409872ff33608d0c9cd292422fde91065cdc49: Status 404 returned error can't find the container with id f4d0c484df6fc5ccbd26b908f2409872ff33608d0c9cd292422fde91065cdc49 Apr 23 17:58:41.703798 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.703772 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d9878b_7280_4232_a0c9_247ed15ce7a8.slice/crio-f0dd6c3693379f1989a976aaf07c3617b0ed9f0481973a1a5e4c57851f25c3d2 WatchSource:0}: Error finding container f0dd6c3693379f1989a976aaf07c3617b0ed9f0481973a1a5e4c57851f25c3d2: Status 404 returned error can't find the container with id f0dd6c3693379f1989a976aaf07c3617b0ed9f0481973a1a5e4c57851f25c3d2 Apr 23 17:58:41.704766 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.704743 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fb900a_d97c_474d_b2b4_649024dc77d9.slice/crio-18b62999488c942d8e04fabeee185a415b60c51e82f42b4d73bd869b36975504 WatchSource:0}: Error finding container 18b62999488c942d8e04fabeee185a415b60c51e82f42b4d73bd869b36975504: Status 404 returned error can't find 
the container with id 18b62999488c942d8e04fabeee185a415b60c51e82f42b4d73bd869b36975504 Apr 23 17:58:41.705786 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.705742 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod554e663c_f1ef_46e3_bfb8_dc66a756354c.slice/crio-d255cca483e343737f508b1f1ae7f4edcae2d158877658dac0662700a9854670 WatchSource:0}: Error finding container d255cca483e343737f508b1f1ae7f4edcae2d158877658dac0662700a9854670: Status 404 returned error can't find the container with id d255cca483e343737f508b1f1ae7f4edcae2d158877658dac0662700a9854670 Apr 23 17:58:41.706595 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.706567 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6296d6a8_88ab_4cde_8f0b_ca707b8e5b51.slice/crio-f4d02adbdd059ece0a9b97f500233ffd9eb95c9e41d45baa455bbdd47a3ef81a WatchSource:0}: Error finding container f4d02adbdd059ece0a9b97f500233ffd9eb95c9e41d45baa455bbdd47a3ef81a: Status 404 returned error can't find the container with id f4d02adbdd059ece0a9b97f500233ffd9eb95c9e41d45baa455bbdd47a3ef81a Apr 23 17:58:41.707520 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.707497 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb4f421_178c_437b_9788_b5d35c25462a.slice/crio-300328eb85ffdabc9a574fc85577cdd8abc1cc84603b6baf859ef228340c941e WatchSource:0}: Error finding container 300328eb85ffdabc9a574fc85577cdd8abc1cc84603b6baf859ef228340c941e: Status 404 returned error can't find the container with id 300328eb85ffdabc9a574fc85577cdd8abc1cc84603b6baf859ef228340c941e Apr 23 17:58:41.710753 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:58:41.710581 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5556727_cb66_4227_ad85_c113c4a3cd70.slice/crio-d62a7a006375b42dd43b368df75af52533989c00daf75eb6b793f0399d1d14d2 WatchSource:0}: Error finding container d62a7a006375b42dd43b368df75af52533989c00daf75eb6b793f0399d1d14d2: Status 404 returned error can't find the container with id d62a7a006375b42dd43b368df75af52533989c00daf75eb6b793f0399d1d14d2 Apr 23 17:58:41.735463 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:41.735437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:58:41.735599 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.735574 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:41.735639 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:41.735624 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:58:42.735610501 +0000 UTC m=+4.202205286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:42.061822 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.061695 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:40 +0000 UTC" deadline="2027-10-17 12:43:18.192089311 +0000 UTC" Apr 23 17:58:42.061822 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.061736 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13002h44m36.130356924s" Apr 23 17:58:42.172551 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.172494 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" event={"ID":"22139571-cd15-4427-b88f-162e4339848d","Type":"ContainerStarted","Data":"d3a249e63927fb402291db1626298cf500627041b76ac6c23f77eb04b914b5aa"} Apr 23 17:58:42.180996 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.179023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal" event={"ID":"43b465d4e8bfce095d9d53677dbda72a","Type":"ContainerStarted","Data":"64cb05831cb4ada6d8f935650af496cd5cffffa9300ac3b464f519f15e9fdc21"} Apr 23 17:58:42.187131 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.187066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"d255cca483e343737f508b1f1ae7f4edcae2d158877658dac0662700a9854670"} Apr 23 17:58:42.196992 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.196965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerStarted","Data":"18b62999488c942d8e04fabeee185a415b60c51e82f42b4d73bd869b36975504"}
Apr 23 17:58:42.206856 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.206834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bkbbs" event={"ID":"2791436a-c956-4bbf-81a8-9cf1dff161c4","Type":"ContainerStarted","Data":"f4d0c484df6fc5ccbd26b908f2409872ff33608d0c9cd292422fde91065cdc49"}
Apr 23 17:58:42.220771 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.220746 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-799b7" event={"ID":"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20","Type":"ContainerStarted","Data":"e01683e940345c78f607c246fb41fd5b457cdf743f77c0bc806c020e01944e1d"}
Apr 23 17:58:42.226034 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.226010 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wxpl" event={"ID":"a5556727-cb66-4227-ad85-c113c4a3cd70","Type":"ContainerStarted","Data":"d62a7a006375b42dd43b368df75af52533989c00daf75eb6b793f0399d1d14d2"}
Apr 23 17:58:42.227773 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.227726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" event={"ID":"9eb4f421-178c-437b-9788-b5d35c25462a","Type":"ContainerStarted","Data":"300328eb85ffdabc9a574fc85577cdd8abc1cc84603b6baf859ef228340c941e"}
Apr 23 17:58:42.232665 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.232610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9z6qm" event={"ID":"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51","Type":"ContainerStarted","Data":"f4d02adbdd059ece0a9b97f500233ffd9eb95c9e41d45baa455bbdd47a3ef81a"}
Apr 23 17:58:42.236461 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.235868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nnpn7" event={"ID":"f0d9878b-7280-4232-a0c9-247ed15ce7a8","Type":"ContainerStarted","Data":"f0dd6c3693379f1989a976aaf07c3617b0ed9f0481973a1a5e4c57851f25c3d2"}
Apr 23 17:58:42.641891 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.641774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:42.642054 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.641944 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:42.642054 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.641963 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:42.642054 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.641977 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:42.642054 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.642040 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:58:44.642019889 +0000 UTC m=+6.108614691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:42.742942 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:42.742894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:42.743135 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.743090 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:42.743194 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:42.743153 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:58:44.743133679 +0000 UTC m=+6.209728470 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:43.157733 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:43.157699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:43.158198 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:43.157849 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:43.158463 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:43.158443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:43.158592 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:43.158569 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:43.258703 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:43.257855 2577 generic.go:358] "Generic (PLEG): container finished" podID="edf64edd20dcf0bca603d270a7bdadff" containerID="45d66bd9e16b60aa9a7b96e1c2b5e4bdeeafd0851d87514b69667574fec5ca2a" exitCode=0
Apr 23 17:58:43.258703 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:43.258638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal" event={"ID":"edf64edd20dcf0bca603d270a7bdadff","Type":"ContainerDied","Data":"45d66bd9e16b60aa9a7b96e1c2b5e4bdeeafd0851d87514b69667574fec5ca2a"}
Apr 23 17:58:43.290410 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:43.290352 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-229.ec2.internal" podStartSLOduration=3.290329679 podStartE2EDuration="3.290329679s" podCreationTimestamp="2026-04-23 17:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:42.195924436 +0000 UTC m=+3.662519246" watchObservedRunningTime="2026-04-23 17:58:43.290329679 +0000 UTC m=+4.756924491"
Apr 23 17:58:44.279734 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:44.279694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal" event={"ID":"edf64edd20dcf0bca603d270a7bdadff","Type":"ContainerStarted","Data":"c7d330c5ede17eb249e7b5b96f1bd4074eb8ed52098252aac7c3232dfd9a026d"}
Apr 23 17:58:44.296449 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:44.296390 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-229.ec2.internal" podStartSLOduration=4.296372828 podStartE2EDuration="4.296372828s" podCreationTimestamp="2026-04-23 17:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:44.295825917 +0000 UTC m=+5.762420750" watchObservedRunningTime="2026-04-23 17:58:44.296372828 +0000 UTC m=+5.762967638"
Apr 23 17:58:44.658971 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:44.658877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:44.659140 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.659100 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:44.659140 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.659122 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:44.659140 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.659137 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:44.659302 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.659205 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:58:48.659184054 +0000 UTC m=+10.125778849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:44.759797 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:44.759747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:44.759994 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.759935 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:44.760085 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:44.760062 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:58:48.760045341 +0000 UTC m=+10.226640138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:45.158779 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:45.158696 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:45.158927 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:45.158823 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:45.161179 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:45.161153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:45.161296 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:45.161262 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:47.160657 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:47.160615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:47.161099 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:47.160741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:47.161180 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:47.161105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:47.161241 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:47.161217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:48.693573 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:48.693522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:48.694136 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.693734 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:48.694136 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.693757 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:48.694136 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.693771 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:48.694136 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.693828 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:58:56.6938077 +0000 UTC m=+18.160402486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:48.794428 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:48.794319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:48.794653 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.794467 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:48.794653 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:48.794556 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:58:56.794518183 +0000 UTC m=+18.261112973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:49.159006 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:49.158612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:49.159006 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:49.158750 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:49.159006 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:49.158765 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:49.159006 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:49.158877 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:51.157824 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:51.157781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:51.158220 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:51.157801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:51.158220 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:51.157906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:51.158220 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:51.157992 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:53.158288 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:53.158195 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:53.158768 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:53.158317 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:53.158768 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:53.158372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:53.158768 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:53.158463 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:55.158213 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:55.158176 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:55.158684 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:55.158334 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:55.158684 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:55.158395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:55.158684 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:55.158517 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:56.752824 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:56.752781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:56.753277 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.752936 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:56.753277 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.752957 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:56.753277 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.752968 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:56.753277 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.753032 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.753010859 +0000 UTC m=+34.219605651 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:56.854063 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:56.854017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:56.854253 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.854163 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:56.854253 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:56.854232 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.85421327 +0000 UTC m=+34.320808058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:57.157647 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:57.157611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:57.157838 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:57.157629 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:57.157838 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:57.157738 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:57.157966 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:57.157846 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:59.157847 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:59.157809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:58:59.158226 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:59.157885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac"
Apr 23 17:58:59.158226 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:59.157895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:58:59.158226 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:58:59.157975 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa"
Apr 23 17:58:59.314408 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:58:59.314185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bkbbs" event={"ID":"2791436a-c956-4bbf-81a8-9cf1dff161c4","Type":"ContainerStarted","Data":"bc24bf22f65eccb12aaba2f26a9d05b27879b25eff293c5f1aed44a509cfd055"}
Apr 23 17:59:00.317167 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.316928 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wxpl" event={"ID":"a5556727-cb66-4227-ad85-c113c4a3cd70","Type":"ContainerStarted","Data":"f0bb700ad51eb94545dd7328f2699b424f51455a8e6a87f3941d52610e4eee54"}
Apr 23 17:59:00.318197 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.318173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" event={"ID":"9eb4f421-178c-437b-9788-b5d35c25462a","Type":"ContainerStarted","Data":"06f8d2e29d22afce9ef217b0aa9cb8ada07ef55cc21e4b5ddc9b3228ad424fda"}
Apr 23 17:59:00.319363 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.319342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9z6qm" event={"ID":"6296d6a8-88ab-4cde-8f0b-ca707b8e5b51","Type":"ContainerStarted","Data":"820d1d93f73ea2694273c6a0b0cd1112c7cc660fdf41f9c7154fc1ce5d34d187"}
Apr 23 17:59:00.320507 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.320489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" event={"ID":"22139571-cd15-4427-b88f-162e4339848d","Type":"ContainerStarted","Data":"e4b925a02488320ceb19082ef85985c431b40e75673dfc14c1fc93dcbc97e005"}
Apr 23 17:59:00.322749 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.322733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 17:59:00.323023 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323002 2577 generic.go:358] "Generic (PLEG): container finished" podID="554e663c-f1ef-46e3-bfb8-dc66a756354c" containerID="e32735bc2a24388ea3eef6045b20bf6988ee125e608d454ed833ce9248d1795d" exitCode=1
Apr 23 17:59:00.323098 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"4ea57bdcf121be067d8e7bccb567a2ca1d69d10f77848694e03249133d3fd60b"}
Apr 23 17:59:00.323098 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323075 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"f0037bee4390a7adcf649cbb8f54b44717e2089ba64fc3b4295a725142b488f2"}
Apr 23 17:59:00.323098 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"df529f321ec1ee07e060950ea4b17275bef50a2bd6b5850a9f28cb0a3582e2fa"}
Apr 23 17:59:00.323098 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"1a78a35e4cffbfdcc6996bb68750c76e62b8fcd857fd5562d1f5ddfe799282c0"}
Apr 23 17:59:00.323215 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerDied","Data":"e32735bc2a24388ea3eef6045b20bf6988ee125e608d454ed833ce9248d1795d"}
Apr 23 17:59:00.323215 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.323115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"f35e261bf135530c44338692489222cb512515c8fe5693fe790314e69161c2d8"}
Apr 23 17:59:00.324215 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.324193 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" containerID="c7320560868b8798ea1e5c2e3fbfac2f8c205c9379b935275a0e828971d74a43" exitCode=0
Apr 23 17:59:00.324311 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.324250 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"c7320560868b8798ea1e5c2e3fbfac2f8c205c9379b935275a0e828971d74a43"}
Apr 23 17:59:00.325457 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.325437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-799b7" event={"ID":"39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20","Type":"ContainerStarted","Data":"6418a0682515855a47ed4e44efe5933ab8d08146dc96eea8c0d928df424d66f1"}
Apr 23 17:59:00.338748 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.338705 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7wxpl" podStartSLOduration=3.888980327 podStartE2EDuration="21.338690088s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.713455027 +0000 UTC m=+3.180049813" lastFinishedPulling="2026-04-23 17:58:59.163164775 +0000 UTC m=+20.629759574" observedRunningTime="2026-04-23 17:59:00.338204343 +0000 UTC m=+21.804799163" watchObservedRunningTime="2026-04-23 17:59:00.338690088 +0000 UTC m=+21.805284896"
Apr 23 17:59:00.370094 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.370046 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zb5m5" podStartSLOduration=3.903890362 podStartE2EDuration="21.370031495s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.698286149 +0000 UTC m=+3.164880935" lastFinishedPulling="2026-04-23 17:58:59.164427268 +0000 UTC m=+20.631022068" observedRunningTime="2026-04-23 17:59:00.353821417 +0000 UTC m=+21.820416224" watchObservedRunningTime="2026-04-23 17:59:00.370031495 +0000 UTC m=+21.836626303"
Apr 23 17:59:00.370227 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.370175 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9z6qm" podStartSLOduration=3.947845295 podStartE2EDuration="21.370169898s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.710855151 +0000 UTC m=+3.177449946" lastFinishedPulling="2026-04-23 17:58:59.133179744 +0000 UTC m=+20.599774549" observedRunningTime="2026-04-23 17:59:00.370165319 +0000 UTC m=+21.836760106"
watchObservedRunningTime="2026-04-23 17:59:00.370169898 +0000 UTC m=+21.836764705" Apr 23 17:59:00.404893 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.404834 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-799b7" podStartSLOduration=3.9720200820000002 podStartE2EDuration="21.404820869s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.700380534 +0000 UTC m=+3.166975325" lastFinishedPulling="2026-04-23 17:58:59.133181323 +0000 UTC m=+20.599776112" observedRunningTime="2026-04-23 17:59:00.404350552 +0000 UTC m=+21.870945360" watchObservedRunningTime="2026-04-23 17:59:00.404820869 +0000 UTC m=+21.871415676" Apr 23 17:59:00.417701 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.417659 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bkbbs" podStartSLOduration=8.759045486 podStartE2EDuration="21.417647073s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.702082631 +0000 UTC m=+3.168677420" lastFinishedPulling="2026-04-23 17:58:54.360684217 +0000 UTC m=+15.827279007" observedRunningTime="2026-04-23 17:59:00.417333815 +0000 UTC m=+21.883928623" watchObservedRunningTime="2026-04-23 17:59:00.417647073 +0000 UTC m=+21.884241880" Apr 23 17:59:00.700870 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:00.700700 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:59:01.088052 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.087949 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:59:00.700867163Z","UUID":"6b391e03-b700-47e1-98e2-a30bbc4b2b7a","Handler":null,"Name":"","Endpoint":""} Apr 23 17:59:01.089998 ip-10-0-128-229 
kubenswrapper[2577]: I0423 17:59:01.089971 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:59:01.089998 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.090004 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:59:01.158074 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.158037 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:01.158253 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:01.158178 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:01.158520 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.158051 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:01.158644 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:01.158623 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:01.330539 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.330486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" event={"ID":"9eb4f421-178c-437b-9788-b5d35c25462a","Type":"ContainerStarted","Data":"53f21919e39d4905e9d060dda24fcf46c38b9e2bfc77ede252459b342b8f253a"} Apr 23 17:59:01.331978 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.331948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nnpn7" event={"ID":"f0d9878b-7280-4232-a0c9-247ed15ce7a8","Type":"ContainerStarted","Data":"4cba260c39d354f48d0dc9e583abed90b3a067cb89f90e3fbbac677ee07da27e"} Apr 23 17:59:01.347999 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:01.347900 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nnpn7" podStartSLOduration=4.920400385 podStartE2EDuration="22.347884514s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.705760364 +0000 UTC m=+3.172355150" lastFinishedPulling="2026-04-23 17:58:59.133244493 +0000 UTC m=+20.599839279" observedRunningTime="2026-04-23 17:59:01.347747966 +0000 UTC m=+22.814342773" watchObservedRunningTime="2026-04-23 17:59:01.347884514 +0000 UTC m=+22.814479321" Apr 23 17:59:02.336480 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:02.336438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" event={"ID":"9eb4f421-178c-437b-9788-b5d35c25462a","Type":"ContainerStarted","Data":"73d76fe5672de1a70691acf334f9dce0d1354cfa1ff5d76b8f6fc937a858b531"} Apr 23 17:59:02.342436 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:02.340001 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log" Apr 23 17:59:02.342901 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:02.342874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"2be4895c1b701266ae89da5d486dd19f792b9197a2bdc02c56bc7cc0c40ad228"} Apr 23 17:59:02.358314 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:02.358260 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bksl6" podStartSLOduration=3.250421878 podStartE2EDuration="23.358246061s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.711177589 +0000 UTC m=+3.177772375" lastFinishedPulling="2026-04-23 17:59:01.819001755 +0000 UTC m=+23.285596558" observedRunningTime="2026-04-23 17:59:02.357765328 +0000 UTC m=+23.824360140" watchObservedRunningTime="2026-04-23 17:59:02.358246061 +0000 UTC m=+23.824840866" Apr 23 17:59:03.158333 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:03.158291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:03.158498 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:03.158308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:03.158498 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:03.158445 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:03.158662 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:03.158508 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:03.489762 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:03.489674 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:59:03.490470 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:03.490449 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:59:04.346339 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:04.346307 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:59:04.346894 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:04.346878 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bkbbs" Apr 23 17:59:05.160447 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.160279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:05.160921 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.160307 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:05.160921 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:05.160547 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:05.160921 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:05.160699 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:05.350349 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.350318 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log" Apr 23 17:59:05.350712 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.350665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"327265a2ed557da09bbfe8206d30a8ad4de0cc834eac99d407ccf4b9486b3303"} Apr 23 17:59:05.351046 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.351026 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:05.351200 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.351055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:05.351200 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.351190 2577 scope.go:117] "RemoveContainer" containerID="e32735bc2a24388ea3eef6045b20bf6988ee125e608d454ed833ce9248d1795d" Apr 23 17:59:05.356171 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.356145 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" containerID="e54db29e930fe533a7b3baf121deade8a3c490be10d83eba75a7dc3f95d1441e" exitCode=0 Apr 23 17:59:05.356313 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.356237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"e54db29e930fe533a7b3baf121deade8a3c490be10d83eba75a7dc3f95d1441e"} Apr 23 17:59:05.369415 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:05.369388 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:06.335055 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.335027 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g6b5k"] Apr 23 17:59:06.335505 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.335141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:06.335505 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:06.335225 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:06.343218 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.343196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8c8cd"] Apr 23 17:59:06.343321 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.343298 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:06.343382 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:06.343365 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:06.361830 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.361803 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log" Apr 23 17:59:06.362799 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.362769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" event={"ID":"554e663c-f1ef-46e3-bfb8-dc66a756354c","Type":"ContainerStarted","Data":"cbd32ec8bb3fc06cf91858591dd65f75dca2c75b7431dd43693df500358f92e7"} Apr 23 17:59:06.365858 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.365836 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:06.368314 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.368290 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" 
containerID="ece09896b4ef2db6ece37b156119713355691f2275b3528cc847e3b52faf95fc" exitCode=0 Apr 23 17:59:06.368413 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.368374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"ece09896b4ef2db6ece37b156119713355691f2275b3528cc847e3b52faf95fc"} Apr 23 17:59:06.384352 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.384319 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:06.404023 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:06.403969 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" podStartSLOduration=9.894327754 podStartE2EDuration="27.403955691s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.70971306 +0000 UTC m=+3.176307851" lastFinishedPulling="2026-04-23 17:58:59.219340995 +0000 UTC m=+20.685935788" observedRunningTime="2026-04-23 17:59:06.403763225 +0000 UTC m=+27.870358032" watchObservedRunningTime="2026-04-23 17:59:06.403955691 +0000 UTC m=+27.870550544" Apr 23 17:59:07.372540 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:07.372508 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" containerID="a2d37f74a967131baa123b096ac5b655886dd081dc60387a2a2e0cb74272a6b6" exitCode=0 Apr 23 17:59:07.372971 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:07.372592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"a2d37f74a967131baa123b096ac5b655886dd081dc60387a2a2e0cb74272a6b6"} Apr 23 17:59:08.158334 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:08.158292 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:08.158580 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:08.158295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:08.158580 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:08.158416 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:08.158580 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:08.158509 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:10.157818 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:10.157783 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:10.158431 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:10.157823 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:10.158431 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:10.157909 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:10.158431 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:10.158035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:12.157594 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:12.157552 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:12.158124 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:12.157552 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:12.158124 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.157700 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8c8cd" podUID="4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac" Apr 23 17:59:12.158124 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.157780 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 17:59:12.780920 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:12.780876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:12.781130 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.781036 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:59:12.781130 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.781061 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:59:12.781130 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.781072 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s9nzw for pod openshift-network-diagnostics/network-check-target-8c8cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:59:12.781130 ip-10-0-128-229 kubenswrapper[2577]: E0423 
17:59:12.781131 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw podName:4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac nodeName:}" failed. No retries permitted until 2026-04-23 17:59:44.781114567 +0000 UTC m=+66.247709353 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9nzw" (UniqueName: "kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw") pod "network-check-target-8c8cd" (UID: "4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:59:12.881663 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:12.881634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:12.881811 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.881791 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:12.881868 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:12.881859 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 17:59:44.88184218 +0000 UTC m=+66.348436965 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:13.294331 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.294304 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-229.ec2.internal" event="NodeReady" Apr 23 17:59:13.294708 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.294438 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:59:13.370610 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.370576 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9l5gk"] Apr 23 17:59:13.394701 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.394506 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wksgm"] Apr 23 17:59:13.394861 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.394666 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 17:59:13.401775 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.401747 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:59:13.401775 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.401763 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:59:13.401987 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.401815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\"" Apr 23 17:59:13.401987 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.401745 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:59:13.410324 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.410297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9l5gk"] Apr 23 17:59:13.410324 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.410327 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wksgm"] Apr 23 17:59:13.410496 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.410460 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wksgm" Apr 23 17:59:13.413253 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.413231 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\"" Apr 23 17:59:13.413401 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.413294 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:59:13.419881 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.419860 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:59:13.486513 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.486477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 17:59:13.486707 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.486557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp775\" (UniqueName: \"kubernetes.io/projected/1c2b2631-b55c-4750-a693-fd0de3b3687f-kube-api-access-mp775\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 17:59:13.587175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 17:59:13.587175 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587177 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zdw\" (UniqueName: \"kubernetes.io/projected/c9a391d0-68cc-4801-b1b2-40e572f5a934-kube-api-access-p7zdw\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.587407 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp775\" (UniqueName: \"kubernetes.io/projected/1c2b2631-b55c-4750-a693-fd0de3b3687f-kube-api-access-mp775\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:13.587407 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.587407 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a391d0-68cc-4801-b1b2-40e572f5a934-config-volume\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.587407 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:13.587298 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:13.587602 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:13.587420 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert
podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:14.087402762 +0000 UTC m=+35.553997548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:13.587602 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.587358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9a391d0-68cc-4801-b1b2-40e572f5a934-tmp-dir\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.611499 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.611462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp775\" (UniqueName: \"kubernetes.io/projected/1c2b2631-b55c-4750-a693-fd0de3b3687f-kube-api-access-mp775\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:13.688745 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.688704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7zdw\" (UniqueName: \"kubernetes.io/projected/c9a391d0-68cc-4801-b1b2-40e572f5a934-kube-api-access-p7zdw\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.688914 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.688786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") "
pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.688914 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.688816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a391d0-68cc-4801-b1b2-40e572f5a934-config-volume\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.688914 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.688838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9a391d0-68cc-4801-b1b2-40e572f5a934-tmp-dir\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.689051 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:13.688931 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:13.689051 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:13.689049 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:14.189030719 +0000 UTC m=+35.655625516 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:13.689206 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.689189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9a391d0-68cc-4801-b1b2-40e572f5a934-tmp-dir\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.698452 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.698422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7zdw\" (UniqueName: \"kubernetes.io/projected/c9a391d0-68cc-4801-b1b2-40e572f5a934-kube-api-access-p7zdw\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:13.701100 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:13.701072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a391d0-68cc-4801-b1b2-40e572f5a934-config-volume\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:14.091171 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.091131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:14.091348 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:14.091268 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23
17:59:14.091348 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:14.091324 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:15.091309866 +0000 UTC m=+36.557904652 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:14.158282 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.158244 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd"
Apr 23 17:59:14.158452 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.158430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 17:59:14.161909 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.161886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:59:14.162192 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.162175 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:59:14.162291 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.162274 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:59:14.163195 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.163177 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\""
Apr 23 17:59:14.163195 ip-10-0-128-229 kubenswrapper[2577]: I0423
17:59:14.163194 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\""
Apr 23 17:59:14.192451 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.192406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:14.192653 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:14.192519 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:14.192653 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:14.192601 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:15.192585732 +0000 UTC m=+36.659180517 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:14.388401 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.388315 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" containerID="00e158dc89f7e2012fcf95e3d01f457fc4011b12e4cdb2ba9228083e5d218676" exitCode=0
Apr 23 17:59:14.388401 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:14.388361 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"00e158dc89f7e2012fcf95e3d01f457fc4011b12e4cdb2ba9228083e5d218676"}
Apr 23 17:59:15.098516 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:15.098473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:15.098715 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:15.098624 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:15.098715 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:15.098698 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:17.098671201 +0000 UTC m=+38.565265994 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:15.199797 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:15.199757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:15.199966 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:15.199902 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:15.200006 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:15.199974 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:17.199958757 +0000 UTC m=+38.666553542 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:15.392951 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:15.392873 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb900a-d97c-474d-b2b4-649024dc77d9" containerID="a8f7e26822089ed62822c704a9952e74117c70685ca83d5e9592908da5e431a7" exitCode=0
Apr 23 17:59:15.392951 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:15.392933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerDied","Data":"a8f7e26822089ed62822c704a9952e74117c70685ca83d5e9592908da5e431a7"}
Apr 23 17:59:16.397624 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:16.397592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" event={"ID":"23fb900a-d97c-474d-b2b4-649024dc77d9","Type":"ContainerStarted","Data":"7870cda96a6f44d153656de46299222b82c456959e2ca3415d11d60919e29e4d"}
Apr 23 17:59:16.424551 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:16.424488 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p7mrf" podStartSLOduration=5.855229849 podStartE2EDuration="37.424476388s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:58:41.706378488 +0000 UTC m=+3.172973284" lastFinishedPulling="2026-04-23 17:59:13.275625025 +0000 UTC m=+34.742219823" observedRunningTime="2026-04-23 17:59:16.423033407 +0000 UTC m=+37.889628226" watchObservedRunningTime="2026-04-23 17:59:16.424476388 +0000 UTC m=+37.891071196"
Apr 23 17:59:17.113249 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:17.113210 2577 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:17.113435 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:17.113356 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:17.113435 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:17.113419 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:21.113405124 +0000 UTC m=+42.579999910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:17.214662 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:17.214627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:17.214799 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:17.214770 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:17.214857 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:17.214847 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed.
No retries permitted until 2026-04-23 17:59:21.214828537 +0000 UTC m=+42.681423324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:21.144065 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:21.144028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:21.144456 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:21.144167 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:21.144456 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:21.144244 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:29.144227477 +0000 UTC m=+50.610822267 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:21.244750 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:21.244708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:21.244905 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:21.244841 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:21.244905 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:21.244893 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:29.244879748 +0000 UTC m=+50.711474534 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:29.196663 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:29.196624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 17:59:29.197105 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:29.196725 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:29.197105 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:29.196785 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:45.196771191 +0000 UTC m=+66.663365977 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found
Apr 23 17:59:29.297623 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:29.297580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 17:59:29.297760 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:29.297709 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:29.297803 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:29.297779 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:45.297760598 +0000 UTC m=+66.764355387 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:33.042907 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.042866 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"]
Apr 23 17:59:33.103083 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.103037 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"]
Apr 23 17:59:33.103083 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.103073 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"]
Apr 23 17:59:33.103281 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.103192 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"
Apr 23 17:59:33.105928 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.105904 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-924q6\""
Apr 23 17:59:33.106366 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.106348 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 17:59:33.106471 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.106353 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 17:59:33.106471 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.106353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 17:59:33.107143 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.107125 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 17:59:33.120898 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.120872 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"]
Apr 23 17:59:33.121052 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.120996 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.123478 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.123453 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 17:59:33.123680 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.123517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 17:59:33.123742 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.123727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 17:59:33.123841 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.123825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 17:59:33.127298 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e09daf07-2394-49af-9049-d1adc1e44e79-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID: \"e09daf07-2394-49af-9049-d1adc1e44e79\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"
Apr 23 17:59:33.127396 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbft\" (UniqueName: \"kubernetes.io/projected/e09daf07-2394-49af-9049-d1adc1e44e79-kube-api-access-xwbft\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID:
\"e09daf07-2394-49af-9049-d1adc1e44e79\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"
Apr 23 17:59:33.127396 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.127482 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.127482 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7aec6d3-2c8c-419f-8098-c9a75301386d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.127482 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.127482 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqggm\" (UniqueName: \"kubernetes.io/projected/f7aec6d3-2c8c-419f-8098-c9a75301386d-kube-api-access-gqggm\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.127655 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.127597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.228361 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e09daf07-2394-49af-9049-d1adc1e44e79-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID: \"e09daf07-2394-49af-9049-d1adc1e44e79\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"
Apr 23 17:59:33.228569 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbft\" (UniqueName: \"kubernetes.io/projected/e09daf07-2394-49af-9049-d1adc1e44e79-kube-api-access-xwbft\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID: \"e09daf07-2394-49af-9049-d1adc1e44e79\") "
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"
Apr 23 17:59:33.228569 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.228569 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.228569 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7aec6d3-2c8c-419f-8098-c9a75301386d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.228782 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"
Apr 23 17:59:33.228782 ip-10-0-128-229 kubenswrapper[2577]: I0423
17:59:33.228662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqggm\" (UniqueName: \"kubernetes.io/projected/f7aec6d3-2c8c-419f-8098-c9a75301386d-kube-api-access-gqggm\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.228782 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.228725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.229193 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.229162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7aec6d3-2c8c-419f-8098-c9a75301386d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.231771 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.231747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.231855 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.231784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-hub\") pod 
\"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.231855 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.231815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e09daf07-2394-49af-9049-d1adc1e44e79-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID: \"e09daf07-2394-49af-9049-d1adc1e44e79\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" Apr 23 17:59:33.237201 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.237179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqggm\" (UniqueName: \"kubernetes.io/projected/f7aec6d3-2c8c-419f-8098-c9a75301386d-kube-api-access-gqggm\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.237436 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.237419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbft\" (UniqueName: \"kubernetes.io/projected/e09daf07-2394-49af-9049-d1adc1e44e79-kube-api-access-xwbft\") pod \"managed-serviceaccount-addon-agent-559668968f-jjjll\" (UID: \"e09daf07-2394-49af-9049-d1adc1e44e79\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" Apr 23 17:59:33.240560 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.240513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.240780 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.240758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7aec6d3-2c8c-419f-8098-c9a75301386d-ca\") pod \"cluster-proxy-proxy-agent-5fd95bb885-7lctf\" (UID: \"f7aec6d3-2c8c-419f-8098-c9a75301386d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.424309 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.424214 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" Apr 23 17:59:33.430194 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.430165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 17:59:33.600881 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.600843 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf"] Apr 23 17:59:33.603052 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:33.603029 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll"] Apr 23 17:59:33.606212 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:59:33.606183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09daf07_2394_49af_9049_d1adc1e44e79.slice/crio-ac6a1d984e58269198983f6600d2c69eb7e3394d6706ee383193066448e5131e WatchSource:0}: Error finding container ac6a1d984e58269198983f6600d2c69eb7e3394d6706ee383193066448e5131e: Status 404 returned error can't find the container with id 
ac6a1d984e58269198983f6600d2c69eb7e3394d6706ee383193066448e5131e Apr 23 17:59:34.445286 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:34.445216 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" event={"ID":"e09daf07-2394-49af-9049-d1adc1e44e79","Type":"ContainerStarted","Data":"ac6a1d984e58269198983f6600d2c69eb7e3394d6706ee383193066448e5131e"} Apr 23 17:59:34.446783 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:34.446731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerStarted","Data":"6a4af02c9b81eafc3ad6ada4b9392946d88e94fb264048ce23929ee6bec53c13"} Apr 23 17:59:37.453028 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:37.452984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" event={"ID":"e09daf07-2394-49af-9049-d1adc1e44e79","Type":"ContainerStarted","Data":"439b66ebb024ab3fb5d9c2811c9ec5756da1c635fa807b06e10d50618e801167"} Apr 23 17:59:37.454271 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:37.454248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerStarted","Data":"09f5565ef31ebc3f2ca9d1a7b967621cc8fb9e321d4fd7da909058b0b3ef3119"} Apr 23 17:59:37.471668 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:37.471625 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" podStartSLOduration=1.1360499179999999 podStartE2EDuration="4.471611014s" podCreationTimestamp="2026-04-23 17:59:33 +0000 UTC" firstStartedPulling="2026-04-23 
17:59:33.607897463 +0000 UTC m=+55.074492249" lastFinishedPulling="2026-04-23 17:59:36.943458551 +0000 UTC m=+58.410053345" observedRunningTime="2026-04-23 17:59:37.47081281 +0000 UTC m=+58.937407695" watchObservedRunningTime="2026-04-23 17:59:37.471611014 +0000 UTC m=+58.938205812" Apr 23 17:59:38.385489 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:38.385463 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfxr2" Apr 23 17:59:39.460350 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:39.460255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerStarted","Data":"bcbf78553da3f37c2df418bf1945841bf4f9eb78b9000b812a4696e7db3c35a1"} Apr 23 17:59:39.460350 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:39.460290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerStarted","Data":"7c3a637247dcef9dc4dae38a080272ecb85897e5ed67065d624e7e4a22bffd99"} Apr 23 17:59:39.488161 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:39.488112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" podStartSLOduration=0.918618603 podStartE2EDuration="6.488097055s" podCreationTimestamp="2026-04-23 17:59:33 +0000 UTC" firstStartedPulling="2026-04-23 17:59:33.606925831 +0000 UTC m=+55.073520621" lastFinishedPulling="2026-04-23 17:59:39.176404275 +0000 UTC m=+60.642999073" observedRunningTime="2026-04-23 17:59:39.485994985 +0000 UTC m=+60.952589792" watchObservedRunningTime="2026-04-23 17:59:39.488097055 +0000 UTC m=+60.954691901" Apr 23 17:59:44.819205 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.819160 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:44.822325 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.822303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:59:44.832112 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.832085 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:59:44.842762 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.842737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nzw\" (UniqueName: \"kubernetes.io/projected/4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac-kube-api-access-s9nzw\") pod \"network-check-target-8c8cd\" (UID: \"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac\") " pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:44.920140 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.920097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 17:59:44.922663 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:44.922644 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:59:44.930978 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:44.930941 2577 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:59:44.931046 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:44.931025 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. No retries permitted until 2026-04-23 18:00:48.93100877 +0000 UTC m=+130.397603556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : secret "metrics-daemon-secret" not found Apr 23 17:59:45.070448 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.070385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\"" Apr 23 17:59:45.077593 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.077568 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:45.190627 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.190596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8c8cd"] Apr 23 17:59:45.194042 ip-10-0-128-229 kubenswrapper[2577]: W0423 17:59:45.194011 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b15bea0_c4e2_4c56_8a0b_1d0bf653c8ac.slice/crio-80648fd14c810c11657f2db8b0061a04280a71fa478853eab7e08f3a930559cf WatchSource:0}: Error finding container 80648fd14c810c11657f2db8b0061a04280a71fa478853eab7e08f3a930559cf: Status 404 returned error can't find the container with id 80648fd14c810c11657f2db8b0061a04280a71fa478853eab7e08f3a930559cf Apr 23 17:59:45.222789 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.222764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 17:59:45.222874 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:45.222861 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:45.222924 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:45.222915 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 18:00:17.22290174 +0000 UTC m=+98.689496525 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found Apr 23 17:59:45.323377 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.323285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm" Apr 23 17:59:45.323508 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:45.323453 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:45.323582 ip-10-0-128-229 kubenswrapper[2577]: E0423 17:59:45.323522 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:17.323505535 +0000 UTC m=+98.790100321 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found Apr 23 17:59:45.474018 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:45.473985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8c8cd" event={"ID":"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac","Type":"ContainerStarted","Data":"80648fd14c810c11657f2db8b0061a04280a71fa478853eab7e08f3a930559cf"} Apr 23 17:59:48.481998 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:48.481959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8c8cd" event={"ID":"4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac","Type":"ContainerStarted","Data":"e3a80fcb6250b402513ed3d4093c2b1ba56f602763a4fb6a78f63cc9016aae1f"} Apr 23 17:59:48.482405 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:48.482111 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 17:59:48.499163 ip-10-0-128-229 kubenswrapper[2577]: I0423 17:59:48.499112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8c8cd" podStartSLOduration=66.880628404 podStartE2EDuration="1m9.49909753s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:59:45.195944329 +0000 UTC m=+66.662539115" lastFinishedPulling="2026-04-23 17:59:47.814413453 +0000 UTC m=+69.281008241" observedRunningTime="2026-04-23 17:59:48.498308151 +0000 UTC m=+69.964902959" watchObservedRunningTime="2026-04-23 17:59:48.49909753 +0000 UTC m=+69.965692338" Apr 23 18:00:17.248120 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:17.248071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 18:00:17.248666 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:17.248173 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 18:00:17.248666 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:17.248226 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert podName:1c2b2631-b55c-4750-a693-fd0de3b3687f nodeName:}" failed. No retries permitted until 2026-04-23 18:01:21.248212443 +0000 UTC m=+162.714807230 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert") pod "ingress-canary-9l5gk" (UID: "1c2b2631-b55c-4750-a693-fd0de3b3687f") : secret "canary-serving-cert" not found Apr 23 18:00:17.349439 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:17.349397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm" Apr 23 18:00:17.349628 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:17.349570 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 18:00:17.349674 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:17.349653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls podName:c9a391d0-68cc-4801-b1b2-40e572f5a934 nodeName:}" failed. 
No retries permitted until 2026-04-23 18:01:21.349636594 +0000 UTC m=+162.816231381 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls") pod "dns-default-wksgm" (UID: "c9a391d0-68cc-4801-b1b2-40e572f5a934") : secret "dns-default-metrics-tls" not found Apr 23 18:00:19.486756 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:19.486724 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8c8cd" Apr 23 18:00:44.508588 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:44.508558 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-799b7_39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20/dns-node-resolver/0.log" Apr 23 18:00:45.511634 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:45.511603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9z6qm_6296d6a8-88ab-4cde-8f0b-ca707b8e5b51/node-ca/0.log" Apr 23 18:00:48.982408 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:00:48.982364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 18:00:48.982838 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:48.982505 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 18:00:48.982838 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:00:48.982606 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs podName:2bffdfc3-a546-4e7a-b9fd-46395fbcfffa nodeName:}" failed. 
No retries permitted until 2026-04-23 18:02:50.982588648 +0000 UTC m=+252.449183435 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs") pod "network-metrics-daemon-g6b5k" (UID: "2bffdfc3-a546-4e7a-b9fd-46395fbcfffa") : secret "metrics-daemon-secret" not found Apr 23 18:01:04.780212 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.780171 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pbckf"] Apr 23 18:01:04.783174 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.783152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.786829 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.786802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 18:01:04.787269 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.787249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 18:01:04.787371 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.787326 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rmsv4\"" Apr 23 18:01:04.788063 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.788045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 18:01:04.788130 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.788054 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 18:01:04.792967 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.792946 2577 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-insights/insights-runtime-extractor-pbckf"] Apr 23 18:01:04.797373 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.797354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ce505f9-904d-44b6-95ed-28d7a1f4556b-crio-socket\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.797453 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.797395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.797502 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.797480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce505f9-904d-44b6-95ed-28d7a1f4556b-data-volume\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.797568 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.797521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvrd\" (UniqueName: \"kubernetes.io/projected/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-api-access-2cvrd\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.797607 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.797573 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ce505f9-904d-44b6-95ed-28d7a1f4556b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.855886 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.855850 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d58df7478-pcpd9"] Apr 23 18:01:04.858692 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.858676 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.861472 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.861451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 18:01:04.861729 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.861703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 18:01:04.861875 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.861830 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxfcc\"" Apr 23 18:01:04.861982 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.861939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 18:01:04.866722 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.866694 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 18:01:04.868342 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.868321 2577 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-7d58df7478-pcpd9"] Apr 23 18:01:04.898195 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f4e671d-2e11-4c0d-af89-5265ece89a36-ca-trust-extracted\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898195 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-trusted-ca\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-certificates\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ce505f9-904d-44b6-95ed-28d7a1f4556b-crio-socket\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898319 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-tls\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ce505f9-904d-44b6-95ed-28d7a1f4556b-crio-socket\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-image-registry-private-configuration\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898454 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-bound-sa-token\") pod \"image-registry-7d58df7478-pcpd9\" 
(UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce505f9-904d-44b6-95ed-28d7a1f4556b-data-volume\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvrd\" (UniqueName: \"kubernetes.io/projected/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-api-access-2cvrd\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ce505f9-904d-44b6-95ed-28d7a1f4556b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-installation-pull-secrets\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898570 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqdvd\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-kube-api-access-qqdvd\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.898822 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.898798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce505f9-904d-44b6-95ed-28d7a1f4556b-data-volume\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.899145 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.899005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.901064 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.901048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ce505f9-904d-44b6-95ed-28d7a1f4556b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbckf\" (UID: \"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.910429 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.910396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvrd\" (UniqueName: \"kubernetes.io/projected/7ce505f9-904d-44b6-95ed-28d7a1f4556b-kube-api-access-2cvrd\") pod \"insights-runtime-extractor-pbckf\" (UID: 
\"7ce505f9-904d-44b6-95ed-28d7a1f4556b\") " pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:04.998838 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.998801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-tls\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.998868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-image-registry-private-configuration\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.998896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-bound-sa-token\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.998948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-installation-pull-secrets\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.998975 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qqdvd\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-kube-api-access-qqdvd\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.999006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f4e671d-2e11-4c0d-af89-5265ece89a36-ca-trust-extracted\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999036 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.999030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-trusted-ca\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999330 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.999059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-certificates\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:04.999563 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:04.999467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f4e671d-2e11-4c0d-af89-5265ece89a36-ca-trust-extracted\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " 
pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.000231 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.000197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-trusted-ca\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.000338 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.000309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-certificates\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.001790 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.001765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-installation-pull-secrets\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.001874 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.001811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-registry-tls\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.001874 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.001851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/1f4e671d-2e11-4c0d-af89-5265ece89a36-image-registry-private-configuration\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.008032 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.008009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-bound-sa-token\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.008325 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.008302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqdvd\" (UniqueName: \"kubernetes.io/projected/1f4e671d-2e11-4c0d-af89-5265ece89a36-kube-api-access-qqdvd\") pod \"image-registry-7d58df7478-pcpd9\" (UID: \"1f4e671d-2e11-4c0d-af89-5265ece89a36\") " pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.092729 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.092639 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pbckf" Apr 23 18:01:05.167327 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.167291 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.213655 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.213625 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pbckf"] Apr 23 18:01:05.219445 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:05.219410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce505f9_904d_44b6_95ed_28d7a1f4556b.slice/crio-848d036ece6554daea3607455268c09409ab2a9eb1a48a3d1134177bba725482 WatchSource:0}: Error finding container 848d036ece6554daea3607455268c09409ab2a9eb1a48a3d1134177bba725482: Status 404 returned error can't find the container with id 848d036ece6554daea3607455268c09409ab2a9eb1a48a3d1134177bba725482 Apr 23 18:01:05.297328 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.297290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d58df7478-pcpd9"] Apr 23 18:01:05.301594 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:05.301561 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4e671d_2e11_4c0d_af89_5265ece89a36.slice/crio-016227a7e630a998871d050ced01818e56ec3e3468d3c580b71d22d6c41e3e9c WatchSource:0}: Error finding container 016227a7e630a998871d050ced01818e56ec3e3468d3c580b71d22d6c41e3e9c: Status 404 returned error can't find the container with id 016227a7e630a998871d050ced01818e56ec3e3468d3c580b71d22d6c41e3e9c Apr 23 18:01:05.669556 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.669445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbckf" event={"ID":"7ce505f9-904d-44b6-95ed-28d7a1f4556b","Type":"ContainerStarted","Data":"3cecd34ce784ef913879c79e4276bd165de0e313b5c5bf9006cc83459485318f"} Apr 23 18:01:05.669556 ip-10-0-128-229 
kubenswrapper[2577]: I0423 18:01:05.669495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbckf" event={"ID":"7ce505f9-904d-44b6-95ed-28d7a1f4556b","Type":"ContainerStarted","Data":"848d036ece6554daea3607455268c09409ab2a9eb1a48a3d1134177bba725482"} Apr 23 18:01:05.670664 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.670637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" event={"ID":"1f4e671d-2e11-4c0d-af89-5265ece89a36","Type":"ContainerStarted","Data":"879da2011f5be13ffb076790320a0562d2bdd3c264610385d6832e488891aeec"} Apr 23 18:01:05.670797 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.670666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" event={"ID":"1f4e671d-2e11-4c0d-af89-5265ece89a36","Type":"ContainerStarted","Data":"016227a7e630a998871d050ced01818e56ec3e3468d3c580b71d22d6c41e3e9c"} Apr 23 18:01:05.670797 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.670772 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:05.690925 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:05.690522 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" podStartSLOduration=1.69050234 podStartE2EDuration="1.69050234s" podCreationTimestamp="2026-04-23 18:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:05.689490243 +0000 UTC m=+147.156085045" watchObservedRunningTime="2026-04-23 18:01:05.69050234 +0000 UTC m=+147.157097149" Apr 23 18:01:06.674878 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:06.674831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-pbckf" event={"ID":"7ce505f9-904d-44b6-95ed-28d7a1f4556b","Type":"ContainerStarted","Data":"510ece29ccc3e8e851ee7d4e90aef813062a5e6d07b3afcac1354eed7671934e"} Apr 23 18:01:07.678368 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:07.678274 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbckf" event={"ID":"7ce505f9-904d-44b6-95ed-28d7a1f4556b","Type":"ContainerStarted","Data":"4fe6d36edf6aedb7df916f76d04a6d4c08b25b11eab4c48362d682e47eddbaca"} Apr 23 18:01:07.696971 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:07.696915 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pbckf" podStartSLOduration=1.681113888 podStartE2EDuration="3.696900224s" podCreationTimestamp="2026-04-23 18:01:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:05.279605061 +0000 UTC m=+146.746199856" lastFinishedPulling="2026-04-23 18:01:07.295391406 +0000 UTC m=+148.761986192" observedRunningTime="2026-04-23 18:01:07.696148041 +0000 UTC m=+149.162742850" watchObservedRunningTime="2026-04-23 18:01:07.696900224 +0000 UTC m=+149.163495071" Apr 23 18:01:13.431675 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:13.431609 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" podUID="f7aec6d3-2c8c-419f-8098-c9a75301386d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 18:01:16.405849 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:01:16.405799 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9l5gk" podUID="1c2b2631-b55c-4750-a693-fd0de3b3687f" Apr 23 18:01:16.420037 ip-10-0-128-229 kubenswrapper[2577]: E0423 
18:01:16.419997 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wksgm" podUID="c9a391d0-68cc-4801-b1b2-40e572f5a934" Apr 23 18:01:16.701025 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:16.700943 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9l5gk" Apr 23 18:01:17.172938 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:01:17.172898 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-g6b5k" podUID="2bffdfc3-a546-4e7a-b9fd-46395fbcfffa" Apr 23 18:01:20.328284 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.328243 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zf54l"] Apr 23 18:01:20.331582 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.331561 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"] Apr 23 18:01:20.331726 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.331708 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.334475 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.334450 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 18:01:20.334632 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.334484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 18:01:20.334732 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.334708 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 18:01:20.334795 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.334785 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.334974 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.334850 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 18:01:20.335083 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.335065 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 18:01:20.335599 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.335584 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 18:01:20.335690 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.335618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sclw6\"" Apr 23 18:01:20.337161 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.337142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-75455\"" Apr 23 18:01:20.337314 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.337212 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 18:01:20.337314 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.337253 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 18:01:20.337524 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.337502 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 18:01:20.343738 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.343717 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"] Apr 23 18:01:20.408473 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8w2d\" (UniqueName: \"kubernetes.io/projected/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-api-access-k8w2d\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408677 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408677 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408618 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408677 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-root\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-wtmp\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408799 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.408996 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-textfile\") pod \"node-exporter-zf54l\" (UID: 
\"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408996 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-metrics-client-ca\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408996 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408996 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-sys\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.408996 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.408970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4qt\" (UniqueName: \"kubernetes.io/projected/8d941eca-ba33-4a26-b5bf-855b863a5e16-kube-api-access-pk4qt\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.509992 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.509941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" Apr 23 18:01:20.509992 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.509991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:01:20.510081 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-root\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l" Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: E0423 18:01:20.510147 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls podName:8d941eca-ba33-4a26-b5bf-855b863a5e16 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:21.010130208 +0000 UTC m=+162.476724994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls") pod "node-exporter-zf54l" (UID: "8d941eca-ba33-4a26-b5bf-855b863a5e16") : secret "node-exporter-tls" not found
Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-root\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-wtmp\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510214 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-accelerators-collector-config\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-textfile\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-metrics-client-ca\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-wtmp\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-sys\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d941eca-ba33-4a26-b5bf-855b863a5e16-sys\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.510552 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4qt\" (UniqueName: \"kubernetes.io/projected/8d941eca-ba33-4a26-b5bf-855b863a5e16-kube-api-access-pk4qt\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8w2d\" (UniqueName: \"kubernetes.io/projected/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-api-access-k8w2d\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510721 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-textfile\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510805 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-accelerators-collector-config\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.511015 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.510992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.511546 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.511255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d941eca-ba33-4a26-b5bf-855b863a5e16-metrics-client-ca\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.513095 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.513066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.513196 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.513139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.513196 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.513175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.519696 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.519675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4qt\" (UniqueName: \"kubernetes.io/projected/8d941eca-ba33-4a26-b5bf-855b863a5e16-kube-api-access-pk4qt\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:20.520686 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.520667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8w2d\" (UniqueName: \"kubernetes.io/projected/0f66e4e5-dba6-43a8-b089-e01f3cb09e8d-kube-api-access-k8w2d\") pod \"kube-state-metrics-69db897b98-9vxzd\" (UID: \"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.648113 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.648009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"
Apr 23 18:01:20.777561 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:20.777507 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9vxzd"]
Apr 23 18:01:20.780104 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:20.780079 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f66e4e5_dba6_43a8_b089_e01f3cb09e8d.slice/crio-69055e153b8ba4b8f27899581490319f335fc9777ad7dff10948c92c4e79c2db WatchSource:0}: Error finding container 69055e153b8ba4b8f27899581490319f335fc9777ad7dff10948c92c4e79c2db: Status 404 returned error can't find the container with id 69055e153b8ba4b8f27899581490319f335fc9777ad7dff10948c92c4e79c2db
Apr 23 18:01:21.016251 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.016157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:21.018585 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.018568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d941eca-ba33-4a26-b5bf-855b863a5e16-node-exporter-tls\") pod \"node-exporter-zf54l\" (UID: \"8d941eca-ba33-4a26-b5bf-855b863a5e16\") " pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:21.241600 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.241568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zf54l"
Apr 23 18:01:21.249512 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:21.249478 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d941eca_ba33_4a26_b5bf_855b863a5e16.slice/crio-35bcc7ed42414744d8bdb3cf6a60c7fa2dc18a62603e9e23c9ae6ee4764caa74 WatchSource:0}: Error finding container 35bcc7ed42414744d8bdb3cf6a60c7fa2dc18a62603e9e23c9ae6ee4764caa74: Status 404 returned error can't find the container with id 35bcc7ed42414744d8bdb3cf6a60c7fa2dc18a62603e9e23c9ae6ee4764caa74
Apr 23 18:01:21.318548 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.318490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 18:01:21.321007 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.320978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c2b2631-b55c-4750-a693-fd0de3b3687f-cert\") pod \"ingress-canary-9l5gk\" (UID: \"1c2b2631-b55c-4750-a693-fd0de3b3687f\") " pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 18:01:21.419334 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.419295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 18:01:21.421841 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.421817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9a391d0-68cc-4801-b1b2-40e572f5a934-metrics-tls\") pod \"dns-default-wksgm\" (UID: \"c9a391d0-68cc-4801-b1b2-40e572f5a934\") " pod="openshift-dns/dns-default-wksgm"
Apr 23 18:01:21.504222 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.504183 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\""
Apr 23 18:01:21.512481 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.512457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9l5gk"
Apr 23 18:01:21.653930 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.653864 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9l5gk"]
Apr 23 18:01:21.659019 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:21.658981 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2b2631_b55c_4750_a693_fd0de3b3687f.slice/crio-ca0d18ba1ea100a3d132f0e2f6b234e3251ac22977d9b6cfe2ff8cba63d57b46 WatchSource:0}: Error finding container ca0d18ba1ea100a3d132f0e2f6b234e3251ac22977d9b6cfe2ff8cba63d57b46: Status 404 returned error can't find the container with id ca0d18ba1ea100a3d132f0e2f6b234e3251ac22977d9b6cfe2ff8cba63d57b46
Apr 23 18:01:21.714071 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.714031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" event={"ID":"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d","Type":"ContainerStarted","Data":"69055e153b8ba4b8f27899581490319f335fc9777ad7dff10948c92c4e79c2db"}
Apr 23 18:01:21.715301 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.715264 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zf54l" event={"ID":"8d941eca-ba33-4a26-b5bf-855b863a5e16","Type":"ContainerStarted","Data":"35bcc7ed42414744d8bdb3cf6a60c7fa2dc18a62603e9e23c9ae6ee4764caa74"}
Apr 23 18:01:21.716391 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:21.716361 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9l5gk" event={"ID":"1c2b2631-b55c-4750-a693-fd0de3b3687f","Type":"ContainerStarted","Data":"ca0d18ba1ea100a3d132f0e2f6b234e3251ac22977d9b6cfe2ff8cba63d57b46"}
Apr 23 18:01:22.720517 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.720478 2577 generic.go:358] "Generic (PLEG): container finished" podID="8d941eca-ba33-4a26-b5bf-855b863a5e16" containerID="a30b752b0edd00417085b1c6aa24f856b463b07995a64b8e507f35c303bc8530" exitCode=0
Apr 23 18:01:22.720954 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.720569 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zf54l" event={"ID":"8d941eca-ba33-4a26-b5bf-855b863a5e16","Type":"ContainerDied","Data":"a30b752b0edd00417085b1c6aa24f856b463b07995a64b8e507f35c303bc8530"}
Apr 23 18:01:22.722883 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.722843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" event={"ID":"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d","Type":"ContainerStarted","Data":"be477866d1b8536b4eb84a848777de0bf934bab1a480f23dfebc8ad63d423e96"}
Apr 23 18:01:22.723005 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.722894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" event={"ID":"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d","Type":"ContainerStarted","Data":"78f9bed489148e90d43950e1a0c5a4bbb596b2d982587522b8b33dd3c37db5fe"}
Apr 23 18:01:22.723005 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.722909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" event={"ID":"0f66e4e5-dba6-43a8-b089-e01f3cb09e8d","Type":"ContainerStarted","Data":"742ad8dca5a061e109ee882c809f56bf73e6137e6f38a432ab86de4ab54f9b35"}
Apr 23 18:01:22.770618 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:22.770563 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9vxzd" podStartSLOduration=1.264371862 podStartE2EDuration="2.770518744s" podCreationTimestamp="2026-04-23 18:01:20 +0000 UTC" firstStartedPulling="2026-04-23 18:01:20.781914083 +0000 UTC m=+162.248508869" lastFinishedPulling="2026-04-23 18:01:22.288060962 +0000 UTC m=+163.754655751" observedRunningTime="2026-04-23 18:01:22.769224769 +0000 UTC m=+164.235819574" watchObservedRunningTime="2026-04-23 18:01:22.770518744 +0000 UTC m=+164.237113546"
Apr 23 18:01:23.432052 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.432009 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" podUID="f7aec6d3-2c8c-419f-8098-c9a75301386d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 18:01:23.726464 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.726365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9l5gk" event={"ID":"1c2b2631-b55c-4750-a693-fd0de3b3687f","Type":"ContainerStarted","Data":"c5e00ce3e36f80daacf59b6530d9858b164b077080ea5f555d28906b70e02953"}
Apr 23 18:01:23.728346 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.728318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zf54l" event={"ID":"8d941eca-ba33-4a26-b5bf-855b863a5e16","Type":"ContainerStarted","Data":"97fe7b435b731396105ef700d2593acb1cedf32641a569040628c9c151b4b7e6"}
Apr 23 18:01:23.728455 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.728356 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zf54l" event={"ID":"8d941eca-ba33-4a26-b5bf-855b863a5e16","Type":"ContainerStarted","Data":"551b11ad4ff2148e3745adaec0e78e027b8a925a53c034e353fd878bba29b151"}
Apr 23 18:01:23.743429 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.743381 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9l5gk" podStartSLOduration=128.836546801 podStartE2EDuration="2m10.743367462s" podCreationTimestamp="2026-04-23 17:59:13 +0000 UTC" firstStartedPulling="2026-04-23 18:01:21.66141686 +0000 UTC m=+163.128011661" lastFinishedPulling="2026-04-23 18:01:23.568237532 +0000 UTC m=+165.034832322" observedRunningTime="2026-04-23 18:01:23.742341914 +0000 UTC m=+165.208936723" watchObservedRunningTime="2026-04-23 18:01:23.743367462 +0000 UTC m=+165.209962271"
Apr 23 18:01:23.760214 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:23.760156 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zf54l" podStartSLOduration=2.721075483 podStartE2EDuration="3.760136123s" podCreationTimestamp="2026-04-23 18:01:20 +0000 UTC" firstStartedPulling="2026-04-23 18:01:21.251108 +0000 UTC m=+162.717702786" lastFinishedPulling="2026-04-23 18:01:22.290168622 +0000 UTC m=+163.756763426" observedRunningTime="2026-04-23 18:01:23.759474803 +0000 UTC m=+165.226069607" watchObservedRunningTime="2026-04-23 18:01:23.760136123 +0000 UTC m=+165.226730932"
Apr 23 18:01:25.172026 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:25.171994 2577 patch_prober.go:28] interesting pod/image-registry-7d58df7478-pcpd9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 18:01:25.172480 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:25.172044 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" podUID="1f4e671d-2e11-4c0d-af89-5265ece89a36" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:01:26.625673 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.625634 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 18:01:26.634053 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.634025 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.636767 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.636736 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 18:01:26.636767 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.636753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 18:01:26.636767 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.636737 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 18:01:26.637006 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.636805 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 18:01:26.637006 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.636805 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 18:01:26.637181 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.637165 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 18:01:26.637903 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.637883 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 18:01:26.637991 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.637904 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4l81noppmbj15\""
Apr 23 18:01:26.637991 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.637965 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 18:01:26.637991 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.637908 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 18:01:26.638186 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.638174 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wjpt5\""
Apr 23 18:01:26.638670 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.638650 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 18:01:26.638755 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.638693 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 18:01:26.641458 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.641311 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 18:01:26.643294 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.643277 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 18:01:26.645706 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.645685 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 18:01:26.661801 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.661988 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.661988 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.661988 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.661988 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.661988 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.661978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gkh\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-kube-api-access-57gkh\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662171 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662348 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662348 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662348 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662348 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662348 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.662492 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.662382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.679813 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.679782 2577 patch_prober.go:28] interesting pod/image-registry-7d58df7478-pcpd9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 18:01:26.679958 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.679831 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" podUID="1f4e671d-2e11-4c0d-af89-5265ece89a36" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:01:26.763141 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763589 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763589 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763589 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763589 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57gkh\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-kube-api-access-57gkh\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.763771 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.764068 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.764068 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.763838 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.764335 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.764209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.764438 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.764361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.764710 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.764683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.765379 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.765059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.765379 
ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.765080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.767960 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.767552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.767960 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.767739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa80070a-0a95-430b-b355-193722f7759c-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.767960 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.767791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa80070a-0a95-430b-b355-193722f7759c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.767960 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.767793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.768206 ip-10-0-128-229 
kubenswrapper[2577]: I0423 18:01:26.768193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.768686 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.768652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.768686 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.768660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.769374 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.769352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-config\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.769423 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.769403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.769828 
ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.769813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.770271 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.770248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.770372 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.770355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa80070a-0a95-430b-b355-193722f7759c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.775700 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.775678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gkh\" (UniqueName: \"kubernetes.io/projected/aa80070a-0a95-430b-b355-193722f7759c-kube-api-access-57gkh\") pod \"prometheus-k8s-0\" (UID: \"aa80070a-0a95-430b-b355-193722f7759c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:26.944016 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:26.943932 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:27.078508 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:27.078315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 18:01:27.082877 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:27.082840 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa80070a_0a95_430b_b355_193722f7759c.slice/crio-4a3be70e5e07d3ad90bc7ce0d00b9b3923279e649e2d5e95bd0f7964d459d90b WatchSource:0}: Error finding container 4a3be70e5e07d3ad90bc7ce0d00b9b3923279e649e2d5e95bd0f7964d459d90b: Status 404 returned error can't find the container with id 4a3be70e5e07d3ad90bc7ce0d00b9b3923279e649e2d5e95bd0f7964d459d90b Apr 23 18:01:27.742750 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:27.742714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"4a3be70e5e07d3ad90bc7ce0d00b9b3923279e649e2d5e95bd0f7964d459d90b"} Apr 23 18:01:28.158048 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.157890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wksgm" Apr 23 18:01:28.160660 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.160639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\"" Apr 23 18:01:28.168951 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.168933 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wksgm" Apr 23 18:01:28.291343 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.291313 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wksgm"] Apr 23 18:01:28.297377 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:01:28.297355 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a391d0_68cc_4801_b1b2_40e572f5a934.slice/crio-4f5fce6740f8337ec33f36c6813742a785ded57dcaa533df1958a70ff1530149 WatchSource:0}: Error finding container 4f5fce6740f8337ec33f36c6813742a785ded57dcaa533df1958a70ff1530149: Status 404 returned error can't find the container with id 4f5fce6740f8337ec33f36c6813742a785ded57dcaa533df1958a70ff1530149 Apr 23 18:01:28.746907 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.746867 2577 generic.go:358] "Generic (PLEG): container finished" podID="aa80070a-0a95-430b-b355-193722f7759c" containerID="bd73ca65705c1af19a8c4fb3713ca55dcc836340205da257d6a5b545bc4e1699" exitCode=0 Apr 23 18:01:28.747354 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.746961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerDied","Data":"bd73ca65705c1af19a8c4fb3713ca55dcc836340205da257d6a5b545bc4e1699"} Apr 23 18:01:28.748352 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:28.748325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wksgm" event={"ID":"c9a391d0-68cc-4801-b1b2-40e572f5a934","Type":"ContainerStarted","Data":"4f5fce6740f8337ec33f36c6813742a785ded57dcaa533df1958a70ff1530149"} Apr 23 18:01:29.752674 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:29.752594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wksgm" 
event={"ID":"c9a391d0-68cc-4801-b1b2-40e572f5a934","Type":"ContainerStarted","Data":"be839abeb46f3d1b7951ad3ab7a3c773f5d9fc8a6cb29560fcea6f861bedbc64"} Apr 23 18:01:29.752674 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:29.752634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wksgm" event={"ID":"c9a391d0-68cc-4801-b1b2-40e572f5a934","Type":"ContainerStarted","Data":"ea375aebd3b961a5850a7e1e28f2e60326ec9cefe0dd5cb2e8465b6ab5be6561"} Apr 23 18:01:29.753131 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:29.752753 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wksgm" Apr 23 18:01:29.770699 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:29.770635 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wksgm" podStartSLOduration=135.599516931 podStartE2EDuration="2m16.770615043s" podCreationTimestamp="2026-04-23 17:59:13 +0000 UTC" firstStartedPulling="2026-04-23 18:01:28.299125914 +0000 UTC m=+169.765720701" lastFinishedPulling="2026-04-23 18:01:29.470224021 +0000 UTC m=+170.936818813" observedRunningTime="2026-04-23 18:01:29.77057002 +0000 UTC m=+171.237164827" watchObservedRunningTime="2026-04-23 18:01:29.770615043 +0000 UTC m=+171.237209852" Apr 23 18:01:30.157755 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:30.157721 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 18:01:32.763777 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:32.763734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"c6ad84cc61e8d515a07ad2994e314b27a73504bd0716a1821ac8ff06e4d8d405"} Apr 23 18:01:32.763777 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:32.763781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"4ebe60862fdb712a313197fa32d46031b6f3f9d57ccf7280d28d6999dfb27cec"} Apr 23 18:01:33.431107 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.431076 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" podUID="f7aec6d3-2c8c-419f-8098-c9a75301386d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 18:01:33.431221 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.431138 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" Apr 23 18:01:33.431642 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.431611 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"bcbf78553da3f37c2df418bf1945841bf4f9eb78b9000b812a4696e7db3c35a1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 18:01:33.431687 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.431676 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" podUID="f7aec6d3-2c8c-419f-8098-c9a75301386d" containerName="service-proxy" containerID="cri-o://bcbf78553da3f37c2df418bf1945841bf4f9eb78b9000b812a4696e7db3c35a1" gracePeriod=30 Apr 23 18:01:33.768978 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.768941 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7aec6d3-2c8c-419f-8098-c9a75301386d" containerID="bcbf78553da3f37c2df418bf1945841bf4f9eb78b9000b812a4696e7db3c35a1" exitCode=2 Apr 23 18:01:33.769410 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.769012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerDied","Data":"bcbf78553da3f37c2df418bf1945841bf4f9eb78b9000b812a4696e7db3c35a1"} Apr 23 18:01:33.769410 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.769052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fd95bb885-7lctf" event={"ID":"f7aec6d3-2c8c-419f-8098-c9a75301386d","Type":"ContainerStarted","Data":"807b08a6cb4a4380b6684185451b53f32dc9c84f5650ebf8a2b61d8ee1d3deac"} Apr 23 18:01:33.772312 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.772288 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"a396a221c22d102ce3ac75af394fc42bad5e3a42885cf1037061edf51c34954e"} Apr 23 18:01:33.772430 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.772316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"c9b576ab5c73f25ca44258cf7dd6e3be9a854e45e59f6bd67cdaf795a68e7f0b"} Apr 23 18:01:33.772430 ip-10-0-128-229 
kubenswrapper[2577]: I0423 18:01:33.772325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"860a099bc7e0f7fc7e59da5e49ec7210810fe9a7923084b877c19e46530819ef"} Apr 23 18:01:33.772430 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.772334 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa80070a-0a95-430b-b355-193722f7759c","Type":"ContainerStarted","Data":"50d3bcc9e54635c66e497a8dd3331c14fab9b7587942abf0e16e1baadbbee572"} Apr 23 18:01:33.814643 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:33.814583 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.516880852 podStartE2EDuration="7.814568073s" podCreationTimestamp="2026-04-23 18:01:26 +0000 UTC" firstStartedPulling="2026-04-23 18:01:27.084937326 +0000 UTC m=+168.551532112" lastFinishedPulling="2026-04-23 18:01:33.382624546 +0000 UTC m=+174.849219333" observedRunningTime="2026-04-23 18:01:33.812627169 +0000 UTC m=+175.279221990" watchObservedRunningTime="2026-04-23 18:01:33.814568073 +0000 UTC m=+175.281162934" Apr 23 18:01:35.171745 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:35.171711 2577 patch_prober.go:28] interesting pod/image-registry-7d58df7478-pcpd9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 18:01:35.172147 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:35.171766 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" podUID="1f4e671d-2e11-4c0d-af89-5265ece89a36" containerName="registry" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 18:01:36.679412 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:36.679382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7d58df7478-pcpd9" Apr 23 18:01:36.944273 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:36.944179 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:01:37.783713 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:37.783677 2577 generic.go:358] "Generic (PLEG): container finished" podID="e09daf07-2394-49af-9049-d1adc1e44e79" containerID="439b66ebb024ab3fb5d9c2811c9ec5756da1c635fa807b06e10d50618e801167" exitCode=255 Apr 23 18:01:37.784130 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:37.783756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" event={"ID":"e09daf07-2394-49af-9049-d1adc1e44e79","Type":"ContainerDied","Data":"439b66ebb024ab3fb5d9c2811c9ec5756da1c635fa807b06e10d50618e801167"} Apr 23 18:01:37.784130 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:37.784083 2577 scope.go:117] "RemoveContainer" containerID="439b66ebb024ab3fb5d9c2811c9ec5756da1c635fa807b06e10d50618e801167" Apr 23 18:01:38.788216 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:38.788171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-559668968f-jjjll" event={"ID":"e09daf07-2394-49af-9049-d1adc1e44e79","Type":"ContainerStarted","Data":"baad2e607a60385a5e86b3f3b02ba6e3ddd3022c301a1cd1df76ec8135545bc5"} Apr 23 18:01:39.758629 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:39.758591 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wksgm" Apr 23 18:01:47.909323 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:01:47.909288 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9l5gk_1c2b2631-b55c-4750-a693-fd0de3b3687f/serve-healthcheck-canary/0.log" Apr 23 18:02:26.945037 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:26.944991 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:02:26.964425 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:26.964402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:02:27.935449 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:27.935424 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 18:02:51.071028 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.070942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 18:02:51.073331 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.073311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bffdfc3-a546-4e7a-b9fd-46395fbcfffa-metrics-certs\") pod \"network-metrics-daemon-g6b5k\" (UID: \"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa\") " pod="openshift-multus/network-metrics-daemon-g6b5k" Apr 23 18:02:51.160941 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.160907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\"" Apr 23 18:02:51.169035 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.169005 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g6b5k"
Apr 23 18:02:51.288903 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.288870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g6b5k"]
Apr 23 18:02:51.292626 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:02:51.292597 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bffdfc3_a546_4e7a_b9fd_46395fbcfffa.slice/crio-150e7c5d356ea3ee1256bdf8bdd88c5d4d47042fb9ec78570ebd582028b596a4 WatchSource:0}: Error finding container 150e7c5d356ea3ee1256bdf8bdd88c5d4d47042fb9ec78570ebd582028b596a4: Status 404 returned error can't find the container with id 150e7c5d356ea3ee1256bdf8bdd88c5d4d47042fb9ec78570ebd582028b596a4
Apr 23 18:02:51.985485 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:51.985441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g6b5k" event={"ID":"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa","Type":"ContainerStarted","Data":"150e7c5d356ea3ee1256bdf8bdd88c5d4d47042fb9ec78570ebd582028b596a4"}
Apr 23 18:02:52.990583 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:52.990548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g6b5k" event={"ID":"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa","Type":"ContainerStarted","Data":"d30a3dcb416d4071a957101ca0213e02ccdffac555018e947c240b172201ff06"}
Apr 23 18:02:52.990583 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:52.990585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g6b5k" event={"ID":"2bffdfc3-a546-4e7a-b9fd-46395fbcfffa","Type":"ContainerStarted","Data":"aa4222d2389f9f4fac95059456c2732fe8a4e060fb8f0d4d4b6652762d84454d"}
Apr 23 18:02:53.008791 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:02:53.008736 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g6b5k" podStartSLOduration=253.119070449 podStartE2EDuration="4m14.008719879s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 18:02:51.294369934 +0000 UTC m=+252.760964719" lastFinishedPulling="2026-04-23 18:02:52.184019361 +0000 UTC m=+253.650614149" observedRunningTime="2026-04-23 18:02:53.007400821 +0000 UTC m=+254.473995639" watchObservedRunningTime="2026-04-23 18:02:53.008719879 +0000 UTC m=+254.475314684"
Apr 23 18:03:12.260696 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.260660 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m2gr7"]
Apr 23 18:03:12.262814 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.262793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.265301 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.265277 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 18:03:12.272295 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.272273 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m2gr7"]
Apr 23 18:03:12.326092 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.326051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-kubelet-config\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.326282 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.326116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-dbus\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.326282 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.326155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/add4dd13-3101-4adf-b234-786ae8119e28-original-pull-secret\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.427357 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.427322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-kubelet-config\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.427484 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.427380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-dbus\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.427484 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.427403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/add4dd13-3101-4adf-b234-786ae8119e28-original-pull-secret\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.427484 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.427449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-kubelet-config\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.427620 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.427589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/add4dd13-3101-4adf-b234-786ae8119e28-dbus\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.429850 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.429823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/add4dd13-3101-4adf-b234-786ae8119e28-original-pull-secret\") pod \"global-pull-secret-syncer-m2gr7\" (UID: \"add4dd13-3101-4adf-b234-786ae8119e28\") " pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.572042 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.572010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m2gr7"
Apr 23 18:03:12.693416 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:12.693383 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m2gr7"]
Apr 23 18:03:12.696862 ip-10-0-128-229 kubenswrapper[2577]: W0423 18:03:12.696820 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd4dd13_3101_4adf_b234_786ae8119e28.slice/crio-cbd2e5a0eb23f547b183a4c45318f992a66c8c0d0752dbbd4cc0435ee3b1a056 WatchSource:0}: Error finding container cbd2e5a0eb23f547b183a4c45318f992a66c8c0d0752dbbd4cc0435ee3b1a056: Status 404 returned error can't find the container with id cbd2e5a0eb23f547b183a4c45318f992a66c8c0d0752dbbd4cc0435ee3b1a056
Apr 23 18:03:13.053885 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:13.053855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m2gr7" event={"ID":"add4dd13-3101-4adf-b234-786ae8119e28","Type":"ContainerStarted","Data":"cbd2e5a0eb23f547b183a4c45318f992a66c8c0d0752dbbd4cc0435ee3b1a056"}
Apr 23 18:03:17.066281 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:17.066237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m2gr7" event={"ID":"add4dd13-3101-4adf-b234-786ae8119e28","Type":"ContainerStarted","Data":"aa37dbe5145ec5bcd84c6023a7a2f672d87f040aa5b8673cb70f4e46d9a89996"}
Apr 23 18:03:17.084867 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:17.084813 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m2gr7" podStartSLOduration=1.687201036 podStartE2EDuration="5.084797702s" podCreationTimestamp="2026-04-23 18:03:12 +0000 UTC" firstStartedPulling="2026-04-23 18:03:12.698314148 +0000 UTC m=+274.164908937" lastFinishedPulling="2026-04-23 18:03:16.095910809 +0000 UTC m=+277.562505603" observedRunningTime="2026-04-23 18:03:17.083526477 +0000 UTC m=+278.550121285" watchObservedRunningTime="2026-04-23 18:03:17.084797702 +0000 UTC m=+278.551392509"
Apr 23 18:03:39.031238 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:39.031209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:03:39.032550 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:39.032510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:03:39.036839 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:03:39.036822 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 18:08:39.050902 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:08:39.050874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:08:39.051421 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:08:39.051405 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:13:39.069791 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:13:39.069717 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:13:39.070322 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:13:39.069882 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:18:39.091052 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:18:39.091022 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:18:39.092160 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:18:39.092138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:23:39.109683 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:23:39.109653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:23:39.111173 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:23:39.111147 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:28:39.128522 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:28:39.128489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:28:39.131187 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:28:39.131166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:33:39.147817 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:33:39.147784 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:33:39.150888 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:33:39.150860 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:38:39.166084 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:38:39.166054 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:38:39.172074 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:38:39.172050 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:43:39.185928 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:43:39.185833 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:43:39.191491 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:43:39.191470 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:48:39.205904 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:48:39.205773 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:48:39.214464 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:48:39.214443 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:53:39.224205 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:53:39.224093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:53:39.232995 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:53:39.232973 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:58:39.242382 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:58:39.242267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 18:58:39.250787 ip-10-0-128-229 kubenswrapper[2577]: I0423 18:58:39.250758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 19:03:39.260046 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:03:39.259934 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 19:03:39.268577 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:03:39.268556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 19:04:27.493632 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:27.493508 2577 ???:1] "http: TLS handshake error from 10.0.143.63:48744: EOF"
Apr 23 19:04:27.504867 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:27.504840 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m2gr7_add4dd13-3101-4adf-b234-786ae8119e28/global-pull-secret-syncer/0.log"
Apr 23 19:04:27.666257 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:27.666229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bkbbs_2791436a-c956-4bbf-81a8-9cf1dff161c4/konnectivity-agent/0.log"
Apr 23 19:04:27.737360 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:27.737327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-229.ec2.internal_43b465d4e8bfce095d9d53677dbda72a/haproxy/0.log"
Apr 23 19:04:31.341074 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.341039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9vxzd_0f66e4e5-dba6-43a8-b089-e01f3cb09e8d/kube-state-metrics/0.log"
Apr 23 19:04:31.366492 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.366464 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9vxzd_0f66e4e5-dba6-43a8-b089-e01f3cb09e8d/kube-rbac-proxy-main/0.log"
Apr 23 19:04:31.390164 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.390137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9vxzd_0f66e4e5-dba6-43a8-b089-e01f3cb09e8d/kube-rbac-proxy-self/0.log"
Apr 23 19:04:31.641848 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.641767 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zf54l_8d941eca-ba33-4a26-b5bf-855b863a5e16/node-exporter/0.log"
Apr 23 19:04:31.662482 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.662457 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zf54l_8d941eca-ba33-4a26-b5bf-855b863a5e16/kube-rbac-proxy/0.log"
Apr 23 19:04:31.684942 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.684918 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zf54l_8d941eca-ba33-4a26-b5bf-855b863a5e16/init-textfile/0.log"
Apr 23 19:04:31.812289 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.812263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/prometheus/0.log"
Apr 23 19:04:31.833522 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.833494 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/config-reloader/0.log"
Apr 23 19:04:31.859480 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.859453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/thanos-sidecar/0.log"
Apr 23 19:04:31.882194 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.882164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/kube-rbac-proxy-web/0.log"
Apr 23 19:04:31.910595 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.910496 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/kube-rbac-proxy/0.log"
Apr 23 19:04:31.941278 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.941253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/kube-rbac-proxy-thanos/0.log"
Apr 23 19:04:31.967121 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:31.967095 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aa80070a-0a95-430b-b355-193722f7759c/init-config-reloader/0.log"
Apr 23 19:04:34.611209 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.611166 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"]
Apr 23 19:04:34.614697 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.614673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.617198 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.617176 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"kube-root-ca.crt\""
Apr 23 19:04:34.617331 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.617182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-25xv6\"/\"default-dockercfg-mzws8\""
Apr 23 19:04:34.618333 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.618318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"openshift-service-ca.crt\""
Apr 23 19:04:34.626660 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.626632 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"]
Apr 23 19:04:34.707191 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.707158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdll\" (UniqueName: \"kubernetes.io/projected/0e96a358-9010-45f7-a7e0-9aa964b2e729-kube-api-access-zjdll\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.707382 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.707208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-lib-modules\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.707382 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.707232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-podres\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.707382 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.707250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-sys\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.707524 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.707411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-proc\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808066 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-podres\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808066 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-sys\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-proc\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdll\" (UniqueName: \"kubernetes.io/projected/0e96a358-9010-45f7-a7e0-9aa964b2e729-kube-api-access-zjdll\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-lib-modules\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-podres\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-proc\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-lib-modules\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.808293 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.808267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e96a358-9010-45f7-a7e0-9aa964b2e729-sys\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.816765 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.816724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdll\" (UniqueName: \"kubernetes.io/projected/0e96a358-9010-45f7-a7e0-9aa964b2e729-kube-api-access-zjdll\") pod \"perf-node-gather-daemonset-ndvh5\" (UID: \"0e96a358-9010-45f7-a7e0-9aa964b2e729\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:34.925554 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:34.925444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:35.049646 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.049574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"]
Apr 23 19:04:35.052244 ip-10-0-128-229 kubenswrapper[2577]: W0423 19:04:35.052211 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e96a358_9010_45f7_a7e0_9aa964b2e729.slice/crio-7a6d0b77636affefa0bd353e19416d93db0b8f531f6e6c8ba1909ac048dfbea7 WatchSource:0}: Error finding container 7a6d0b77636affefa0bd353e19416d93db0b8f531f6e6c8ba1909ac048dfbea7: Status 404 returned error can't find the container with id 7a6d0b77636affefa0bd353e19416d93db0b8f531f6e6c8ba1909ac048dfbea7
Apr 23 19:04:35.053795 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.053779 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 19:04:35.592242 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.592214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wksgm_c9a391d0-68cc-4801-b1b2-40e572f5a934/dns/0.log"
Apr 23 19:04:35.617142 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.617106 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wksgm_c9a391d0-68cc-4801-b1b2-40e572f5a934/kube-rbac-proxy/0.log"
Apr 23 19:04:35.641460 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.641434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-799b7_39b5a3bb-ca15-4b0b-90d9-fb7ca1985f20/dns-node-resolver/0.log"
Apr 23 19:04:35.910478 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.910381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5" event={"ID":"0e96a358-9010-45f7-a7e0-9aa964b2e729","Type":"ContainerStarted","Data":"36f3969d12ce6696ed61553a600009b3ec8e4ab1344df9abd4fe92c2dcda6a2c"}
Apr 23 19:04:35.910478 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.910426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5" event={"ID":"0e96a358-9010-45f7-a7e0-9aa964b2e729","Type":"ContainerStarted","Data":"7a6d0b77636affefa0bd353e19416d93db0b8f531f6e6c8ba1909ac048dfbea7"}
Apr 23 19:04:35.910698 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.910515 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:35.932693 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:35.932642 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5" podStartSLOduration=1.932624744 podStartE2EDuration="1.932624744s" podCreationTimestamp="2026-04-23 19:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:04:35.930875435 +0000 UTC m=+3957.397470242" watchObservedRunningTime="2026-04-23 19:04:35.932624744 +0000 UTC m=+3957.399219552"
Apr 23 19:04:36.131929 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:36.131869 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7d58df7478-pcpd9_1f4e671d-2e11-4c0d-af89-5265ece89a36/registry/0.log"
Apr 23 19:04:36.156706 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:36.156670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9z6qm_6296d6a8-88ab-4cde-8f0b-ca707b8e5b51/node-ca/0.log"
Apr 23 19:04:37.331773 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:37.331741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9l5gk_1c2b2631-b55c-4750-a693-fd0de3b3687f/serve-healthcheck-canary/0.log"
Apr 23 19:04:37.866242 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:37.866217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbckf_7ce505f9-904d-44b6-95ed-28d7a1f4556b/kube-rbac-proxy/0.log"
Apr 23 19:04:37.890311 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:37.890279 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbckf_7ce505f9-904d-44b6-95ed-28d7a1f4556b/exporter/0.log"
Apr 23 19:04:37.921077 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:37.921041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbckf_7ce505f9-904d-44b6-95ed-28d7a1f4556b/extractor/0.log"
Apr 23 19:04:41.923704 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:41.923677 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-ndvh5"
Apr 23 19:04:46.075839 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.075803 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7wxpl_a5556727-cb66-4227-ad85-c113c4a3cd70/kube-multus/0.log"
Apr 23 19:04:46.305125 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.305090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/kube-multus-additional-cni-plugins/0.log"
Apr 23 19:04:46.330597 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.330503 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/egress-router-binary-copy/0.log"
Apr 23 19:04:46.358981 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.358948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/cni-plugins/0.log"
Apr 23 19:04:46.391375 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.391344 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/bond-cni-plugin/0.log"
Apr 23 19:04:46.419712 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.419679 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/routeoverride-cni/0.log"
Apr 23 19:04:46.445453 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.445421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/whereabouts-cni-bincopy/0.log"
Apr 23 19:04:46.474586 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.474560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p7mrf_23fb900a-d97c-474d-b2b4-649024dc77d9/whereabouts-cni/0.log"
Apr 23 19:04:46.818417 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.818384 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g6b5k_2bffdfc3-a546-4e7a-b9fd-46395fbcfffa/network-metrics-daemon/0.log"
Apr 23 19:04:46.844295 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:46.844270 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g6b5k_2bffdfc3-a546-4e7a-b9fd-46395fbcfffa/kube-rbac-proxy/0.log"
Apr 23 19:04:48.733357 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.733322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-controller/0.log"
Apr 23 19:04:48.752438 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.752409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/0.log"
Apr 23 19:04:48.789160 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.789125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovn-acl-logging/1.log"
Apr 23 19:04:48.813281 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.813253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/kube-rbac-proxy-node/0.log"
Apr 23 19:04:48.837856 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.837821 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 19:04:48.855698 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.855671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/northd/0.log"
Apr 23 19:04:48.878866 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.878837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/nbdb/0.log"
Apr 23 19:04:48.902024 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:48.901986 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/sbdb/0.log"
Apr 23 19:04:49.072080 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:49.072047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfxr2_554e663c-f1ef-46e3-bfb8-dc66a756354c/ovnkube-controller/0.log"
Apr 23 19:04:50.083824 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:50.083795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8c8cd_4b15bea0-c4e2-4c56-8a0b-1d0bf653c8ac/network-check-target-container/0.log"
Apr 23 19:04:51.029304 ip-10-0-128-229 kubenswrapper[2577]: I0423 19:04:51.029278 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nnpn7_f0d9878b-7280-4232-a0c9-247ed15ce7a8/iptables-alerter/0.log"