Apr 16 17:38:37.607238 ip-10-0-143-234 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:38:37.607253 ip-10-0-143-234 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:38:37.607263 ip-10-0-143-234 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:38:37.607575 ip-10-0-143-234 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:38:47.708078 ip-10-0-143-234 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:38:47.708100 ip-10-0-143-234 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7395b9988e2b449da9e53fdfaa173b6e --
Apr 16 17:41:03.358638 ip-10-0-143-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:41:03.858525 ip-10-0-143-234 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:03.858525 ip-10-0-143-234 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:41:03.858525 ip-10-0-143-234 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:03.858525 ip-10-0-143-234 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
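The `'resources'` failures above mean systemd could not read an `EnvironmentFile` referenced by `kubelet.service` before the `ExecStartPre` task ran, and the restart then failed because `crio.service` was not yet installed. A minimal sketch of the usual mitigation, with hypothetical paths (the real unit and file names on this node are not shown in the log): prefixing the path with `-` tells systemd to ignore a missing environment file instead of failing the unit.

```ini
# Hypothetical drop-in: /etc/systemd/system/kubelet.service.d/10-env.conf
[Unit]
# Order after CRI-O so restarts do not race its unit being installed.
Wants=crio.service
After=crio.service

[Service]
# Leading "-" makes a missing file non-fatal (no 'resources' failure).
EnvironmentFile=-/etc/kubernetes/kubelet-env
```

A `systemctl daemon-reload` is required before the drop-in takes effect.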
Apr 16 17:41:03.858525 ip-10-0-143-234 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:41:03.861119 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.861029    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:41:03.866354 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866326    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:03.866354 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866354    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866358    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866361    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866365    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866367    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866370    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866373    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866375    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866378    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866380    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866383    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866385    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866388    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866390    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866393    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866395    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866398    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866406    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866408    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866411    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:03.866425 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866413    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866415    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866418    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866421    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866423    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866428    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866432    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
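The deprecation warnings above all point to the same remedy: move the flag values into the KubeletConfiguration file named by `--config` (here `/etc/kubernetes/kubelet.conf`, per the flag dump later in the log). A sketch of the corresponding fields; the values shown are illustrative, not taken from this node:

```yaml
# KubeletConfiguration equivalents of the deprecated flags warned about above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved (example values)
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration-style tuning
  memory.available: 100Mi
```

On an OpenShift node this file is rendered by the machine-config operator, so edits would normally go through a KubeletConfig custom resource rather than the file directly.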
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866437    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866441    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866445    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866447    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866449    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866452    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866454    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866457    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866460    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866462    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866464    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866467    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:03.866878 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866469    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866471    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866474    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866476    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866479    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866482    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866484    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866486    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866488    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866492    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866494    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866497    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866499    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866502    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866504    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866507    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866510    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866513    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866515    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866518    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:03.867389 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866520    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866523    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866525    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866527    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866530    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866532    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866536    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866538    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866540    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866543    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866545    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866547    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866550    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866552    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866555    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866557    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866560    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866562    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866565    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866567    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:03.867860 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866569    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866572    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866575    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866577    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866579    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866582    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866973    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866979    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866982    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866985    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866989    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
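The long run of `unrecognized feature gate` warnings is expected noise on this platform: the cluster-scoped gate list (OpenShift names such as `GatewayAPI` or `ManagedBootImages`) is fed into the kubelet's feature-gate parser, which only knows upstream Kubernetes gates. The parser logs a warning per unknown name and continues, as the successful flag parsing later in this log shows; only the known gates (here `KMSv1` and `ServiceAccountTokenNodeBinding`) are actually applied. A sketch of where these come from in the KubeletConfiguration, with illustrative values:

```yaml
# featureGates stanza in /etc/kubernetes/kubelet.conf (sketch, not this node's file).
# Unknown names trigger feature_gate.go:328 warnings and are otherwise ignored.
featureGates:
  KMSv1: true                           # known but deprecated -> feature_gate.go:349 warning
  ServiceAccountTokenNodeBinding: true  # known, already GA    -> feature_gate.go:351 warning
```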
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866992    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866996    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.866998    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867001    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867004    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867006    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867008    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867011    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867014    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:03.868320 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867016    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867018    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867021    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867023    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867026    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867028    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867031    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867035    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867039    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867041    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867044    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867046    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867049    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867051    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867054    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867056    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867058    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867061    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867063    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:03.868796 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867067    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867070    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867072    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867075    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867077    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867079    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867082    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867084    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867086    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867089    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867091    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867093    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867096    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867099    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867102    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867104    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867106    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867108    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867111    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867113    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:03.869242 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867116    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867118    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867121    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867123    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867126    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867129    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867131    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867133    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867136    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867139    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867141    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867143    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867146    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867149    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867152    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867154    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867157    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867159    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867162    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:03.869743 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867164    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867167    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867169    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867171    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867174    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867176    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867179    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867181    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867183    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867186    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867188    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867191    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867193    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.867195    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.867992    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868000    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868008    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868013    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868017    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868020    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868024    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:41:03.870241 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868029    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868032    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868035    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868038    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868042    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868045    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868049    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868051    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868054    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868057    2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868060    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868062    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868066    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868069    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868072    2577 flags.go:64] FLAG: --config-dir=""
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868074    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868078    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868082    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868084    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868087    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868091    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868094    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868097    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868100    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868103    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:41:03.870752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868106    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868111    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868114    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868116    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868119    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868122    2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868125    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868130    2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868133    2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868135    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868138    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868142    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868145    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868148    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868151    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868154    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868157    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868159    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868162    2577 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868165 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868168 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868170 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868173 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868177 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868180 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 17:41:03.871458 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868183 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868185 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868189 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868192 2577 flags.go:64] FLAG: --help="false" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868195 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-143-234.ec2.internal" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868197 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868200 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868203 2577 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868206 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868210 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868213 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868216 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868218 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868221 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868224 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868227 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868229 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868232 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868235 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868238 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868241 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:41:03.872032 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:03.868243 2577 flags.go:64] FLAG: --lock-file="" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868246 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868249 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:41:03.872032 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868252 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868257 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868260 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868262 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868265 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868268 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868271 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868274 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868277 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868281 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868287 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868292 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 
17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868295 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868298 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868301 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868303 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868306 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868309 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868312 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868321 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868324 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868342 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868346 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:41:03.872588 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868349 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868354 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 
17:41:03.868357 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868361 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868363 2577 flags.go:64] FLAG: --port="10250" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868366 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868369 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04859bd661f0d5ddc" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868372 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868376 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868378 2577 flags.go:64] FLAG: --register-node="true" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868381 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868384 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868387 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868390 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868393 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868396 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868399 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868402 2577 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868405 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868410 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868413 2577 flags.go:64] FLAG: --runonce="false" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868415 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868418 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868421 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868424 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868427 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:41:03.873113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868430 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868433 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868437 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868440 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868443 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868446 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868449 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868451 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868454 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868457 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868462 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868465 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868467 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868471 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868474 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868477 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868479 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868482 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868485 2577 flags.go:64] FLAG: --v="2" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868489 2577 flags.go:64] FLAG: --version="false" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868493 2577 flags.go:64] FLAG: --vmodule="" 
Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868497 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.868500 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868620 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868624 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:41:03.873730 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868629 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868632 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868634 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868637 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868639 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868642 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868645 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868648 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868650 2577 feature_gate.go:328] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868653 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868656 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868659 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868661 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868664 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868666 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868669 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868671 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868674 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868676 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868678 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:41:03.874311 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868681 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: 
W0416 17:41:03.868683 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868686 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868688 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868691 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868693 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868695 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868698 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868700 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868703 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868705 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868707 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868711 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868713 2577 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868716 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868721 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868723 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868725 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868728 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868730 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:41:03.874842 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868733 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868735 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868737 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868740 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868743 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868746 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 
17:41:03.868748 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868750 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868753 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868755 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868758 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868760 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868762 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868764 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868767 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868769 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868771 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868775 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868777 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 
17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868780 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:41:03.875345 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868782 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868785 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868788 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868790 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868794 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868797 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868800 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868803 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868806 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868808 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868811 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868814 
2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868816 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868819 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868821 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868823 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868826 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868829 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868831 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:41:03.875837 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868835 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:41:03.876291 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868838 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:41:03.876291 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868841 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:41:03.876291 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868845 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:41:03.876291 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.868848 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:03.876291 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.869746 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:03.876764 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.876742 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:41:03.876795 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.876765 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:41:03.876824 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876812 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:03.876824 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876819 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:03.876824 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876822 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:03.876824 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876825 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876829 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876831 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876834 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876837 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876840 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876843 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876845 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876848 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876851 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876853 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876856 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876859 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876861 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876864 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876866 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876868 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876871 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876873 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876876 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:03.876921 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876879 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876881 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876883 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876886 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876888 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876891 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876893 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876896 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876899 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876901 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876904 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876906 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876908 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876911 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876914 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876917 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876920 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876922 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876924 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876927 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:03.877410 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876929 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876932 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876934 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876936 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876939 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876941 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876944 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876946 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876949 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876951 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876954 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876956 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876958 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876961 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876963 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876965 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876967 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876971 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876976 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:03.877883 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876978 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876981 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876984 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876986 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876988 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876991 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876993 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.876997 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877000 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877003 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877005 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877007 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877010 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877013 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877016 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877019 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877021 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877024 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877026 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:03.878355 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877028 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877031 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877033 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877035 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877038 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.877043 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877139 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877144 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877147 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877150 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877153 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877155 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877158 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877160 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877163 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:41:03.878798 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877166 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877168 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877171 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877173 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877175 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877178 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877181 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877184 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877187 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877192 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877194 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877197 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877200 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877202 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877205 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877207 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877209 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877212 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877214 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:41:03.879203 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877217 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877220 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877223 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877226 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877228 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877231 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877233 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877236 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877238 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877241 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877243 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877246 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877248 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877251 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877253 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877256 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877258 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877261 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877264 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877266 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:41:03.879654 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877269 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877271 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877274 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877276 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877278 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877280 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877283 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877285 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877287 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877290 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877292 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877294 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877297 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877299 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877302 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877304 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877306 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877309 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877311 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:41:03.880109 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877313 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877316 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877318 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877320 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877323 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877325 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877343 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877346 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877348 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877351 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877354 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877356 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877358 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877361 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877364 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877366 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877368 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877371 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:41:03.880570 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:03.877373 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:41:03.880986 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.877378 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:41:03.880986 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.878246 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 17:41:03.882324 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.882309 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 17:41:03.883430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.883419 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 17:41:03.883586 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.883567 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:41:03.883631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.883612 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:41:03.912099 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.912061 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:41:03.914861 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.914833 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:41:03.927904 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.927876 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:41:03.934083 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.934067 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 17:41:03.937825 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.937803 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:41:03.940319 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.940298 2577 fs.go:135] Filesystem UUIDs: map[444139f7-3e9d-4377-a786-b23966725029:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b018c556-fbbb-4f57-a341-cdd8e1ee1758:/dev/nvme0n1p4]
Apr 16 17:41:03.940391 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.940319 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:41:03.945541 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.945521 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:41:03.946278 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946145 2577 manager.go:217] Machine: {Timestamp:2026-04-16 17:41:03.94407389 +0000 UTC m=+0.452464043 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199573 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25213dd76cd46fe99296da4812203b SystemUUID:ec25213d-d76c-d46f-e992-96da4812203b BootID:7395b998-8e2b-449d-a9e5-3fdfaa173b6e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6f:8e:42:a3:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6f:8e:42:a3:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:ee:40:cd:10:90 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:41:03.946278 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946273 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:41:03.946411 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946374 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 17:41:03.946758 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946738 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 17:41:03.946909 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946759 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 17:41:03.946950 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946919 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 17:41:03.946950 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946928 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 17:41:03.946950 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.946941 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:41:03.948584 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.948572 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:41:03.950522 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.950512 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:41:03.950632 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.950623 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 17:41:03.953259 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.953250 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 17:41:03.953294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.953263 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 17:41:03.953294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.953278 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 17:41:03.953294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.953291 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 17:41:03.953444 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.953299 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 17:41:03.954583 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.954567 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:41:03.954583 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.954586 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:41:03.958267 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.958248 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 17:41:03.960775 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.960762 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 17:41:03.962416 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962399 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962425 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962434 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962441 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962448 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962453 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962460 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 17:41:03.962470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962465 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 17:41:03.962735 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962482 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 17:41:03.962735 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962490 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 17:41:03.962735 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962506 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 17:41:03.962735 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.962514 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 17:41:03.963523 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.963511 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 17:41:03.963523 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.963522 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 17:41:03.970500 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.970475 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 17:41:03.970830 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.970814 2577 server.go:1295] "Started kubelet"
Apr 16 17:41:03.970932 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.970882 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 17:41:03.971040 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.971006 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 17:41:03.971090 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.971061 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 17:41:03.971763 ip-10-0-143-234 systemd[1]: Started Kubernetes Kubelet.
Apr 16 17:41:03.973027 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.973011 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-234.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 17:41:03.973113 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.973007 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 17:41:03.973390 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.973375 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 17:41:03.973666 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.973650 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 17:41:03.974144 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.974127 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 17:41:03.980007 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.979987 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 17:41:03.980007 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.980000 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 17:41:03.980282 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.980242 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 17:41:03.980806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.980785 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 17:41:03.980879 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.980811 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 17:41:03.980947 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.980934 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 17:41:03.981012 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.980940 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 17:41:03.981098 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.981085 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 17:41:03.981258 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.981230 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:03.982178 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982161 2577 factory.go:153] Registering CRI-O factory
Apr 16 17:41:03.982264 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982181 2577 factory.go:223] Registration of the crio container factory successfully
Apr 16 17:41:03.982264 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982249 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 17:41:03.982264 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982264 2577 factory.go:55] Registering systemd factory
Apr 16 17:41:03.982428 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982272 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 16 17:41:03.982428 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982294 2577 factory.go:103] Registering Raw factory
Apr 16 17:41:03.982428 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982309 2577 manager.go:1196] Started watching for new ooms in manager
Apr 16 17:41:03.982745 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.982724 2577 manager.go:319] Starting recovery of all containers
Apr 16 17:41:03.985402 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.985372 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 17:41:03.985499 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.985451 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 17:41:03.986688 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:03.985540 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-234.ec2.internal.18a6e72873b9b9f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-234.ec2.internal,UID:ip-10-0-143-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-234.ec2.internal,},FirstTimestamp:2026-04-16 17:41:03.970499058 +0000 UTC m=+0.478889220,LastTimestamp:2026-04-16 17:41:03.970499058 +0000 UTC m=+0.478889220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-234.ec2.internal,}"
Apr 16 17:41:03.986839 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.986817 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8cfn"
Apr 16 17:41:03.993693 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.993569 2577 manager.go:324] Recovery completed
Apr 16 17:41:03.994801 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.994773 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8cfn"
Apr 16 17:41:03.999534 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:03.999522 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.002820 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.002805 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.002890 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.002834 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.002890 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.002843 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.003407 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.003393 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 17:41:04.003407 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.003406 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 17:41:04.003512 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.003433 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:41:04.004909 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.004834 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-234.ec2.internal.18a6e72875a6eb7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-234.ec2.internal,UID:ip-10-0-143-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-234.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-234.ec2.internal,},FirstTimestamp:2026-04-16 17:41:04.002820987 +0000 UTC m=+0.511211143,LastTimestamp:2026-04-16 17:41:04.002820987 +0000 UTC m=+0.511211143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-234.ec2.internal,}"
Apr 16 17:41:04.007494 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.007481 2577 policy_none.go:49] "None policy: Start"
Apr 16 17:41:04.007560 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.007498 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 17:41:04.007560 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.007510 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 17:41:04.047951 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.047932 2577 manager.go:341] "Starting Device Plugin manager"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.047981 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.047996 2577 server.go:85] "Starting device plugin registration server"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.048274 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.048287 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.048443 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.048576 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.048590 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.049072 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 17:41:04.067703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.049114 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.115668 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.115576 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 17:41:04.116885 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.116868 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 17:41:04.116955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.116900 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 17:41:04.116955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.116924 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 17:41:04.116955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.116932 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 17:41:04.117093 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.116971 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 17:41:04.119757 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.119739 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:04.149269 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.149232 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.150198 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.150166 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.150198 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.150199 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.150351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.150209 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.150351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.150239 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.161733 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.161716 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.161793 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.161744 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-234.ec2.internal\": node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.181430 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.181405 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.217259 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.217226 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"]
Apr 16 17:41:04.217353 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.217308 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.218301 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.218285 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.218397 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.218318 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.218397 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.218350 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.220698 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.220682 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.220835 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.220822 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.220879 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.220851 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.221427 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221412 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.221427 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221417 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.221528 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221444 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.221528 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221455 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.221528 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221444 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.221629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.221540 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.224088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.224071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.224174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.224112 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:41:04.224862 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.224840 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:41:04.224967 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.224868 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:41:04.224967 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.224881 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:41:04.251194 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.251167 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-234.ec2.internal\" not found" node="ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.255663 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.255645 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-234.ec2.internal\" not found" node="ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.282006 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.281977 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.283141 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.283123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.283203 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.283150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.283203 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.283170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ed94319a7f3740b078962730ca47007-config\") pod \"kube-apiserver-proxy-ip-10-0-143-234.ec2.internal\" (UID: \"0ed94319a7f3740b078962730ca47007\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.382236 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.382133 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.384339 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.384387 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ed94319a7f3740b078962730ca47007-config\") pod \"kube-apiserver-proxy-ip-10-0-143-234.ec2.internal\" (UID: \"0ed94319a7f3740b078962730ca47007\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.384387 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.384459 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.384459 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e4336b3d29f4e58db03e6558b9b42fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal\" (UID: \"5e4336b3d29f4e58db03e6558b9b42fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.384459 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.384420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ed94319a7f3740b078962730ca47007-config\") pod \"kube-apiserver-proxy-ip-10-0-143-234.ec2.internal\" (UID: \"0ed94319a7f3740b078962730ca47007\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.482979 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.482933 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.554306 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.554275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.558091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.558072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal"
Apr 16 17:41:04.583293 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.583259 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.683931 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.683848 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.784409 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.784375 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.884023 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.883987 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:41:04.884643 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.884144 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:41:04.885076 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.885056 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.980395 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.980368 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:41:04.985404 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:04.985377 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found"
Apr 16 17:41:04.994596 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.994565 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:41:04.998013 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.997988 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:36:03 +0000 UTC" deadline="2027-12-14 23:48:56.696983709 +0000 UTC"
Apr 16 17:41:04.998013 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:04.998011 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14574h7m51.698975461s"
Apr 16 17:41:05.012575 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.012544 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-f9scl"
Apr 16 17:41:05.020308 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.020286 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-f9scl"
Apr 16 17:41:05.061631 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:05.061589 2577 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4336b3d29f4e58db03e6558b9b42fd.slice/crio-094280863b5509489de333833e6f24296928d4d5aa5ab04d8d07d7edb9f8e328 WatchSource:0}: Error finding container 094280863b5509489de333833e6f24296928d4d5aa5ab04d8d07d7edb9f8e328: Status 404 returned error can't find the container with id 094280863b5509489de333833e6f24296928d4d5aa5ab04d8d07d7edb9f8e328 Apr 16 17:41:05.066306 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.066290 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:41:05.086435 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:05.086403 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found" Apr 16 17:41:05.120084 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:05.120046 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed94319a7f3740b078962730ca47007.slice/crio-39d2fc55eb06dcf72f06a69abfa84ee88f335e7269f9151695637f1e2e5743fb WatchSource:0}: Error finding container 39d2fc55eb06dcf72f06a69abfa84ee88f335e7269f9151695637f1e2e5743fb: Status 404 returned error can't find the container with id 39d2fc55eb06dcf72f06a69abfa84ee88f335e7269f9151695637f1e2e5743fb Apr 16 17:41:05.120565 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.120506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal" event={"ID":"5e4336b3d29f4e58db03e6558b9b42fd","Type":"ContainerStarted","Data":"094280863b5509489de333833e6f24296928d4d5aa5ab04d8d07d7edb9f8e328"} Apr 16 17:41:05.187195 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:05.187158 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found" Apr 16 17:41:05.259550 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:05.259466 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:41:05.288274 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:05.288231 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-234.ec2.internal\" not found" Apr 16 17:41:05.308272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.308251 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:41:05.380425 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.380390 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal" Apr 16 17:41:05.389430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.389322 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:41:05.390385 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.390366 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal" Apr 16 17:41:05.401189 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.401165 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:41:05.444365 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.444322 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:41:05.953856 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.953825 2577 apiserver.go:52] "Watching apiserver" Apr 16 17:41:05.961958 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.961933 2577 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 17:41:05.962512 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.962485 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-s72ln","openshift-multus/multus-q4vbk","openshift-multus/network-metrics-daemon-gg8gs","openshift-network-operator/iptables-alerter-2krk7","kube-system/konnectivity-agent-2xwww","kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal","openshift-dns/node-resolver-vrkbq","openshift-network-diagnostics/network-check-target-vwz6h","openshift-ovn-kubernetes/ovnkube-node-g5src","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z","openshift-cluster-node-tuning-operator/tuned-dfv6z","openshift-image-registry/node-ca-pv5jg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal"] Apr 16 17:41:05.967354 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.967308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.969649 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.969626 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 17:41:05.969765 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.969661 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.969831 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.969800 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mjst5\"" Apr 16 17:41:05.969876 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.969838 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:05.969924 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.969912 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 17:41:05.970002 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:05.969906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:05.970192 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.970177 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.972801 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.972115 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:05.974295 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.974244 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:05.974968 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.974949 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.974968 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.974959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 17:41:05.975109 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.975009 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.975109 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.975026 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4m88m\"" Apr 16 17:41:05.976287 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.976234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.976287 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.976258 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.976507 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.976495 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 17:41:05.976692 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.976672 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:05.976905 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.976798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:05.976905 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:05.976862 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:05.977087 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.977073 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-q5gxj\"" Apr 16 17:41:05.978613 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.978470 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 17:41:05.978828 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.978808 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-c4mbn\"" Apr 16 17:41:05.978927 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.978828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 17:41:05.979761 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.979741 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.981835 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.981814 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 17:41:05.981921 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.981861 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kkmhp\"" Apr 16 17:41:05.982183 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982162 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.982183 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982178 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 17:41:05.982351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982206 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 17:41:05.982351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982206 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.982351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982186 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 17:41:05.982887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.982870 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:05.984803 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.984782 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 17:41:05.984892 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.984827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 17:41:05.984952 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.984779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dwljk\"" Apr 16 17:41:05.985745 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.985434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:05.987438 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.987423 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.987523 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.987511 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.987691 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.987676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lhz44\"" Apr 16 17:41:05.987984 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.987967 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:05.990743 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.990040 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.990743 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.990060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 17:41:05.990743 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.990294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.990743 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.990406 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q52x8\"" Apr 16 17:41:05.991252 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991231 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:05.991459 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991551 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991551 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-registration-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:05.991551 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991560 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-node-log\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-env-overrides\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-socket-dir-parent\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-kubelet\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d913013-d26e-4756-9b14-4e6907f4baf0-agent-certs\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:05.991712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991671 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-var-lib-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovn-node-metrics-cert\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh2c\" (UniqueName: \"kubernetes.io/projected/6ed2c2e6-5851-4969-afa8-f8336c09ee54-kube-api-access-pxh2c\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-netd\") pod \"ovnkube-node-g5src\" (UID: 
\"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-multus-certs\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cnibin\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-os-release\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991951 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24061e65-3c69-48d6-8110-9c66fb64e102-iptables-alerter-script\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-systemd-units\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:05.991999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.991983 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cnibin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24061e65-3c69-48d6-8110-9c66fb64e102-host-slash\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:05.992381 
ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-systemd\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-hostroot\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-bin\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-config\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kph6\" (UniqueName: \"kubernetes.io/projected/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-kube-api-access-9kph6\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76pc\" (UniqueName: \"kubernetes.io/projected/24061e65-3c69-48d6-8110-9c66fb64e102-kube-api-access-t76pc\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-system-cni-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:05.992381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-conf-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-etc-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzbb\" (UniqueName: \"kubernetes.io/projected/b278f078-c804-48cf-b77a-2509deb41cc2-kube-api-access-fxzbb\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992467 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cni-binary-copy\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-netns\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d913013-d26e-4756-9b14-4e6907f4baf0-konnectivity-ca\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-multus\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-netns\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-log-socket\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkcr\" (UniqueName: \"kubernetes.io/projected/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-kube-api-access-4vkcr\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbr9\" (UniqueName: \"kubernetes.io/projected/eccdd8a8-ee59-4c3c-852e-f012ce698554-kube-api-access-5jbr9\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-system-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-k8s-cni-cncf-io\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.992864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-kubelet\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-script-lib\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-sys-fs\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992972 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.992980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-socket-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-os-release\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-bin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-ovn\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-binary-copy\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-daemon-config\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.993478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.993220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-slash\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:05.996394 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.996372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-device-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:05.996471 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.996432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-etc-kubernetes\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:05.996579 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.996564 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qmr25\""
Apr 16 17:41:05.999849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:05.999559 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:41:06.020985 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.020890 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:36:05 +0000 UTC" deadline="2027-11-24 12:17:48.974928993 +0000 UTC"
Apr 16 17:41:06.020985 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.020918 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14082h36m42.954014341s"
Apr 16 17:41:06.081687 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.081660 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 17:41:06.097167 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-systemd-units\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097326 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcvn\" (UniqueName: \"kubernetes.io/projected/f144295e-123d-49ad-96f0-a793fc10f2bd-kube-api-access-cfcvn\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.097326 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-systemd-units\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097326 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhkf\" (UniqueName: \"kubernetes.io/projected/2e01f328-7d13-47e4-ba26-d47919ca94fb-kube-api-access-6hhkf\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq"
Apr 16 17:41:06.097326 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cnibin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cnibin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24061e65-3c69-48d6-8110-9c66fb64e102-host-slash\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-systemd\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24061e65-3c69-48d6-8110-9c66fb64e102-host-slash\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7"
Apr 16 17:41:06.097536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8643560d-c751-40a2-a84e-fd9619f0a198-serviceca\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bsj\" (UniqueName: \"kubernetes.io/projected/8643560d-c751-40a2-a84e-fd9619f0a198-kube-api-access-99bsj\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-hostroot\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-bin\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-systemd\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-config\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-modprobe-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.097754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-systemd\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-tuned\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kph6\" (UniqueName: \"kubernetes.io/projected/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-kube-api-access-9kph6\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t76pc\" (UniqueName: \"kubernetes.io/projected/24061e65-3c69-48d6-8110-9c66fb64e102-kube-api-access-t76pc\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8643560d-c751-40a2-a84e-fd9619f0a198-host\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-hostroot\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-system-cni-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-conf-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-system-cni-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:06.098005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.097972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:06.098276 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-config\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098276 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-conf-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098276 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-bin\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098276 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-etc-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzbb\" (UniqueName: \"kubernetes.io/projected/b278f078-c804-48cf-b77a-2509deb41cc2-kube-api-access-fxzbb\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-host\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cni-binary-copy\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-etc-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-netns\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098558 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d913013-d26e-4756-9b14-4e6907f4baf0-konnectivity-ca\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww"
Apr 16 17:41:06.098558 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-netns\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098610 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e01f328-7d13-47e4-ba26-d47919ca94fb-tmp-dir\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq"
Apr 16 17:41:06.098610 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:06.098610 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-multus\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098691 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-netns\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098691 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-log-socket\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098747 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098747 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkcr\" (UniqueName: \"kubernetes.io/projected/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-kube-api-access-4vkcr\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbr9\" (UniqueName: \"kubernetes.io/projected/eccdd8a8-ee59-4c3c-852e-f012ce698554-kube-api-access-5jbr9\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d913013-d26e-4756-9b14-4e6907f4baf0-konnectivity-ca\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-system-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-k8s-cni-cncf-io\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-cni-binary-copy\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-netns\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.098806 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-kubelet\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-script-lib\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-k8s-cni-cncf-io\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk"
Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.098957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-kubelet\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src"
Apr 16 17:41:06.099046 ip-10-0-143-234
kubenswrapper[2577]: I0416 17:41:06.098983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-sys-fs\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-multus\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysconfig\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.099039 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:06.099046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-log-socket\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-var-lib-kubelet\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-socket-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-os-release\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-system-cni-dir\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099144 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-sys-fs\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.099168 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:06.599146247 +0000 UTC m=+3.107536387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-bin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-os-release\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-ovn\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-ovn\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-cni-bin\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e01f328-7d13-47e4-ba26-d47919ca94fb-hosts-file\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-socket-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-binary-copy\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.099406 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-daemon-config\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-slash\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-device-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099488 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-kubernetes\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-etc-kubernetes\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-registration-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100160 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:06.099607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-sys\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-node-log\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-env-overrides\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-run\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.100160 
ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovnkube-script-lib\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-socket-dir-parent\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-kubelet\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-run-ovn-kubernetes\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d913013-d26e-4756-9b14-4e6907f4baf0-agent-certs\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww" Apr 16 
17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cni-binary-copy\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-slash\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-var-lib-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-daemon-config\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovn-node-metrics-cert\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-device-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-run-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-var-lib-kubelet\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-etc-kubernetes\") pod \"multus-q4vbk\" (UID: 
\"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-registration-dir\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-node-log\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxh2c\" (UniqueName: \"kubernetes.io/projected/6ed2c2e6-5851-4969-afa8-f8336c09ee54-kube-api-access-pxh2c\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.099994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-netd\") 
pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100057 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b278f078-c804-48cf-b77a-2509deb41cc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.100887 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-conf\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-lib-modules\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-tmp\") pod \"tuned-dfv6z\" (UID: 
\"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-multus-certs\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cnibin\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-os-release\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/24061e65-3c69-48d6-8110-9c66fb64e102-iptables-alerter-script\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-env-overrides\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-host-run-multus-certs\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ed2c2e6-5851-4969-afa8-f8336c09ee54-multus-socket-dir-parent\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-cnibin\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-host-cni-netd\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-os-release\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.100912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-var-lib-openvswitch\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.101294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24061e65-3c69-48d6-8110-9c66fb64e102-iptables-alerter-script\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:06.101635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.101370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.103509 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.103489 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-ovn-node-metrics-cert\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.103684 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.103665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d913013-d26e-4756-9b14-4e6907f4baf0-agent-certs\") pod \"konnectivity-agent-2xwww\" (UID: \"3d913013-d26e-4756-9b14-4e6907f4baf0\") " pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:06.114599 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.114045 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:06.114599 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.114072 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:06.114599 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.114102 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:06.114599 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.114493 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:06.614473121 +0000 UTC m=+3.122863279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:06.118210 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.118173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76pc\" (UniqueName: \"kubernetes.io/projected/24061e65-3c69-48d6-8110-9c66fb64e102-kube-api-access-t76pc\") pod \"iptables-alerter-2krk7\" (UID: \"24061e65-3c69-48d6-8110-9c66fb64e102\") " pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:06.118784 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.118762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzbb\" (UniqueName: \"kubernetes.io/projected/b278f078-c804-48cf-b77a-2509deb41cc2-kube-api-access-fxzbb\") pod \"aws-ebs-csi-driver-node-qbj6z\" (UID: \"b278f078-c804-48cf-b77a-2509deb41cc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.119909 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.119886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkcr\" (UniqueName: \"kubernetes.io/projected/2a931acd-9936-4d4e-a3b6-d2d86cb92da4-kube-api-access-4vkcr\") pod \"ovnkube-node-g5src\" (UID: \"2a931acd-9936-4d4e-a3b6-d2d86cb92da4\") " pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.120014 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.119891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh2c\" (UniqueName: 
\"kubernetes.io/projected/6ed2c2e6-5851-4969-afa8-f8336c09ee54-kube-api-access-pxh2c\") pod \"multus-q4vbk\" (UID: \"6ed2c2e6-5851-4969-afa8-f8336c09ee54\") " pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.120014 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.119902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kph6\" (UniqueName: \"kubernetes.io/projected/f243dd2e-6d7f-4c1b-9ec7-346a02c79bba-kube-api-access-9kph6\") pod \"multus-additional-cni-plugins-s72ln\" (UID: \"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba\") " pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.121636 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.121612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbr9\" (UniqueName: \"kubernetes.io/projected/eccdd8a8-ee59-4c3c-852e-f012ce698554-kube-api-access-5jbr9\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:06.123098 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.123069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal" event={"ID":"0ed94319a7f3740b078962730ca47007","Type":"ContainerStarted","Data":"39d2fc55eb06dcf72f06a69abfa84ee88f335e7269f9151695637f1e2e5743fb"} Apr 16 17:41:06.201581 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8643560d-c751-40a2-a84e-fd9619f0a198-serviceca\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99bsj\" (UniqueName: 
\"kubernetes.io/projected/8643560d-c751-40a2-a84e-fd9619f0a198-kube-api-access-99bsj\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-modprobe-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-systemd\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-tuned\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8643560d-c751-40a2-a84e-fd9619f0a198-host\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.201746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-host\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e01f328-7d13-47e4-ba26-d47919ca94fb-tmp-dir\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-systemd\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysconfig\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-var-lib-kubelet\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e01f328-7d13-47e4-ba26-d47919ca94fb-hosts-file\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-modprobe-d\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-host\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysconfig\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-kubernetes\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e01f328-7d13-47e4-ba26-d47919ca94fb-hosts-file\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.201979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-var-lib-kubelet\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202038 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-kubernetes\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/2e01f328-7d13-47e4-ba26-d47919ca94fb-tmp-dir\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8643560d-c751-40a2-a84e-fd9619f0a198-host\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-sys\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-run\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-conf\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-sys\") pod \"tuned-dfv6z\" (UID: 
\"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-lib-modules\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-run\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-tmp\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcvn\" (UniqueName: \"kubernetes.io/projected/f144295e-123d-49ad-96f0-a793fc10f2bd-kube-api-access-cfcvn\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhkf\" (UniqueName: \"kubernetes.io/projected/2e01f328-7d13-47e4-ba26-d47919ca94fb-kube-api-access-6hhkf\") pod \"node-resolver-vrkbq\" (UID: 
\"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-sysctl-conf\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f144295e-123d-49ad-96f0-a793fc10f2bd-lib-modules\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.202709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.202557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8643560d-c751-40a2-a84e-fd9619f0a198-serviceca\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.204323 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.204270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-etc-tuned\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.205095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.205079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f144295e-123d-49ad-96f0-a793fc10f2bd-tmp\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.209445 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.209424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcvn\" (UniqueName: \"kubernetes.io/projected/f144295e-123d-49ad-96f0-a793fc10f2bd-kube-api-access-cfcvn\") pod \"tuned-dfv6z\" (UID: \"f144295e-123d-49ad-96f0-a793fc10f2bd\") " pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.209947 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.209927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhkf\" (UniqueName: \"kubernetes.io/projected/2e01f328-7d13-47e4-ba26-d47919ca94fb-kube-api-access-6hhkf\") pod \"node-resolver-vrkbq\" (UID: \"2e01f328-7d13-47e4-ba26-d47919ca94fb\") " pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.210162 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.210142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bsj\" (UniqueName: \"kubernetes.io/projected/8643560d-c751-40a2-a84e-fd9619f0a198-kube-api-access-99bsj\") pod \"node-ca-pv5jg\" (UID: \"8643560d-c751-40a2-a84e-fd9619f0a198\") " pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.279247 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.279203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q4vbk" Apr 16 17:41:06.287307 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.287283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" Apr 16 17:41:06.299051 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.299025 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2krk7" Apr 16 17:41:06.304364 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.304342 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:06.311993 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.311969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:06.317609 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.317592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s72ln" Apr 16 17:41:06.324234 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.324215 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" Apr 16 17:41:06.330815 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.330796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pv5jg" Apr 16 17:41:06.336393 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.336374 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vrkbq" Apr 16 17:41:06.604536 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.604440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:06.604690 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.604578 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:06.604690 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.604638 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:07.604623961 +0000 UTC m=+4.113014101 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:06.688009 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.687966 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e01f328_7d13_47e4_ba26_d47919ca94fb.slice/crio-e0bae3e60850b076638df0a03a0814ec1e907e7970742d2ef5102f4716157030 WatchSource:0}: Error finding container e0bae3e60850b076638df0a03a0814ec1e907e7970742d2ef5102f4716157030: Status 404 returned error can't find the container with id e0bae3e60850b076638df0a03a0814ec1e907e7970742d2ef5102f4716157030 Apr 16 17:41:06.688926 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.688870 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d913013_d26e_4756_9b14_4e6907f4baf0.slice/crio-b2d366fae6ab420cff45d0865eeefdd27eb0fc77213301acf0f6a3fe74713990 WatchSource:0}: Error finding container b2d366fae6ab420cff45d0865eeefdd27eb0fc77213301acf0f6a3fe74713990: Status 404 returned error can't find the container with id b2d366fae6ab420cff45d0865eeefdd27eb0fc77213301acf0f6a3fe74713990 Apr 16 17:41:06.691892 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.690878 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed2c2e6_5851_4969_afa8_f8336c09ee54.slice/crio-4872996a506bff2007a2379192171150fa2667fa58abc39c7556067fa7ce8be5 WatchSource:0}: Error finding container 4872996a506bff2007a2379192171150fa2667fa58abc39c7556067fa7ce8be5: Status 404 returned error can't find the container with id 4872996a506bff2007a2379192171150fa2667fa58abc39c7556067fa7ce8be5 Apr 16 17:41:06.693143 
ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.692993 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb278f078_c804_48cf_b77a_2509deb41cc2.slice/crio-7bdc22158d07ca2acf7d0a2f25105e97b4644c6dc47dc0c710a3f65922079643 WatchSource:0}: Error finding container 7bdc22158d07ca2acf7d0a2f25105e97b4644c6dc47dc0c710a3f65922079643: Status 404 returned error can't find the container with id 7bdc22158d07ca2acf7d0a2f25105e97b4644c6dc47dc0c710a3f65922079643 Apr 16 17:41:06.694707 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.694608 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf144295e_123d_49ad_96f0_a793fc10f2bd.slice/crio-965153a200367613f632167b49a0d29205ec9c754b039512695e464003633520 WatchSource:0}: Error finding container 965153a200367613f632167b49a0d29205ec9c754b039512695e464003633520: Status 404 returned error can't find the container with id 965153a200367613f632167b49a0d29205ec9c754b039512695e464003633520 Apr 16 17:41:06.695710 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.695689 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a931acd_9936_4d4e_a3b6_d2d86cb92da4.slice/crio-f200a1b8ab4b82e158530aff50b331f2a6879aa9d05a4296d3a5784871b278e9 WatchSource:0}: Error finding container f200a1b8ab4b82e158530aff50b331f2a6879aa9d05a4296d3a5784871b278e9: Status 404 returned error can't find the container with id f200a1b8ab4b82e158530aff50b331f2a6879aa9d05a4296d3a5784871b278e9 Apr 16 17:41:06.696151 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.696132 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8643560d_c751_40a2_a84e_fd9619f0a198.slice/crio-4874829b34ca08464182ac59457fae4effb8cce6d90cebec4e42e55e9c94bc54 WatchSource:0}: 
Error finding container 4874829b34ca08464182ac59457fae4effb8cce6d90cebec4e42e55e9c94bc54: Status 404 returned error can't find the container with id 4874829b34ca08464182ac59457fae4effb8cce6d90cebec4e42e55e9c94bc54 Apr 16 17:41:06.705542 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:06.705517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:06.705674 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.705659 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:06.705736 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.705681 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:06.705736 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.705693 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:06.705806 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:06.705747 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:07.70572955 +0000 UTC m=+4.214119693 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:06.718248 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.718214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf243dd2e_6d7f_4c1b_9ec7_346a02c79bba.slice/crio-ed38cf1e5629b1173b3c5971807006f4c324b643d36ef67318431909915ca2cf WatchSource:0}: Error finding container ed38cf1e5629b1173b3c5971807006f4c324b643d36ef67318431909915ca2cf: Status 404 returned error can't find the container with id ed38cf1e5629b1173b3c5971807006f4c324b643d36ef67318431909915ca2cf
Apr 16 17:41:06.718972 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:06.718948 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24061e65_3c69_48d6_8110_9c66fb64e102.slice/crio-3c7195faf7b398e518c145c274894502227b5d2425da3557412784e87bffab6b WatchSource:0}: Error finding container 3c7195faf7b398e518c145c274894502227b5d2425da3557412784e87bffab6b: Status 404 returned error can't find the container with id 3c7195faf7b398e518c145c274894502227b5d2425da3557412784e87bffab6b
Apr 16 17:41:07.022162 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.021856 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:36:05 +0000 UTC" deadline="2027-12-26 09:02:26.423335442 +0000 UTC"
Apr 16 17:41:07.022162 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.022088 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14847h21m19.401252473s"
Apr 16 17:41:07.133983 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.133722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q4vbk" event={"ID":"6ed2c2e6-5851-4969-afa8-f8336c09ee54","Type":"ContainerStarted","Data":"4872996a506bff2007a2379192171150fa2667fa58abc39c7556067fa7ce8be5"}
Apr 16 17:41:07.138356 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.138132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal" event={"ID":"0ed94319a7f3740b078962730ca47007","Type":"ContainerStarted","Data":"305bebbbf7529b2944a61669f8cb8e153dde34ea01b96e885aed7565fc99a83d"}
Apr 16 17:41:07.139788 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.139765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2krk7" event={"ID":"24061e65-3c69-48d6-8110-9c66fb64e102","Type":"ContainerStarted","Data":"3c7195faf7b398e518c145c274894502227b5d2425da3557412784e87bffab6b"}
Apr 16 17:41:07.142157 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.142108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerStarted","Data":"ed38cf1e5629b1173b3c5971807006f4c324b643d36ef67318431909915ca2cf"}
Apr 16 17:41:07.144104 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.144079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pv5jg" event={"ID":"8643560d-c751-40a2-a84e-fd9619f0a198","Type":"ContainerStarted","Data":"4874829b34ca08464182ac59457fae4effb8cce6d90cebec4e42e55e9c94bc54"}
Apr 16 17:41:07.147444 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.147398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2xwww" event={"ID":"3d913013-d26e-4756-9b14-4e6907f4baf0","Type":"ContainerStarted","Data":"b2d366fae6ab420cff45d0865eeefdd27eb0fc77213301acf0f6a3fe74713990"}
Apr 16 17:41:07.162427 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.162370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vrkbq" event={"ID":"2e01f328-7d13-47e4-ba26-d47919ca94fb","Type":"ContainerStarted","Data":"e0bae3e60850b076638df0a03a0814ec1e907e7970742d2ef5102f4716157030"}
Apr 16 17:41:07.166761 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.166698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"f200a1b8ab4b82e158530aff50b331f2a6879aa9d05a4296d3a5784871b278e9"}
Apr 16 17:41:07.173631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.173566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" event={"ID":"f144295e-123d-49ad-96f0-a793fc10f2bd","Type":"ContainerStarted","Data":"965153a200367613f632167b49a0d29205ec9c754b039512695e464003633520"}
Apr 16 17:41:07.182051 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.181998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" event={"ID":"b278f078-c804-48cf-b77a-2509deb41cc2","Type":"ContainerStarted","Data":"7bdc22158d07ca2acf7d0a2f25105e97b4644c6dc47dc0c710a3f65922079643"}
Apr 16 17:41:07.614010 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.613404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:07.614010 ip-10-0-143-234 kubenswrapper[2577]: E0416
17:41:07.613594 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:07.614010 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:07.613655 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:09.613637473 +0000 UTC m=+6.122027619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:07.714641 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:07.714595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:07.714820 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:07.714770 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:07.714820 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:07.714786 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:07.714820 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:07.714798 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:07.714962 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:07.714857 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:09.714839186 +0000 UTC m=+6.223229327 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:08.122360 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:08.122234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:08.123300 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:08.122944 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:08.123300 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:08.123086 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:08.123300 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:08.123217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:08.191826 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:08.191611 2577 generic.go:358] "Generic (PLEG): container finished" podID="5e4336b3d29f4e58db03e6558b9b42fd" containerID="6481cfd5c9c35c345e9a754f9d57574eff10e83a5825048fb03ab7177f5fb685" exitCode=0
Apr 16 17:41:08.191826 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:08.191700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal" event={"ID":"5e4336b3d29f4e58db03e6558b9b42fd","Type":"ContainerDied","Data":"6481cfd5c9c35c345e9a754f9d57574eff10e83a5825048fb03ab7177f5fb685"}
Apr 16 17:41:08.208905 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:08.208821 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-234.ec2.internal" podStartSLOduration=3.208801598 podStartE2EDuration="3.208801598s" podCreationTimestamp="2026-04-16 17:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:07.150646567 +0000 UTC m=+3.659036730" watchObservedRunningTime="2026-04-16 17:41:08.208801598 +0000 UTC m=+4.717191761"
Apr 16 17:41:09.197542 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:09.197064 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal" event={"ID":"5e4336b3d29f4e58db03e6558b9b42fd","Type":"ContainerStarted","Data":"1990edede11b55c644766a69b731eacc971a3812262aea9861e2fb62e6e4f14e"}
Apr 16 17:41:09.629763 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:09.629670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:09.629940 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.629817 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:09.629940 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.629877 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:13.629858949 +0000 UTC m=+10.138249106 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:09.730488 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:09.730447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:09.730703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.730660 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:09.730703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.730683 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:09.730703 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.730700 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:09.730832 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:09.730761 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:13.730742249 +0000 UTC m=+10.239132413 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:10.122011 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:10.121507 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:10.122011 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:10.121518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:10.122011 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:10.121642 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:10.122011 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:10.121916 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:12.120478 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:12.120453 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:12.120922 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:12.120545 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:12.120922 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:12.120888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:12.121033 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:12.120988 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:13.666428 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:13.666389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:13.666896 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.666501 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:13.666896 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.666582 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:21.666562756 +0000 UTC m=+18.174952901 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:13.767238 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:13.767199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:13.767459 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.767434 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:41:13.767529 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.767462 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:41:13.767529 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.767475 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:13.767726 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:13.767543 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:21.767521252 +0000 UTC m=+18.275911406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:41:14.118775 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:14.118687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:14.118925 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:14.118814 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:14.118925 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:14.118862 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:14.119041 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:14.118992 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:16.118106 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:16.118011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:16.118602 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:16.118011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:16.118602 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:16.118146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:16.118602 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:16.118259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:18.118049 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:18.118007 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:18.118604 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:18.118065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:18.118604 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:18.118138 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:18.118604 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:18.118244 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:19.217790 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.217741 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-234.ec2.internal" podStartSLOduration=14.217726077 podStartE2EDuration="14.217726077s" podCreationTimestamp="2026-04-16 17:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:09.209750871 +0000 UTC m=+5.718141034" watchObservedRunningTime="2026-04-16 17:41:19.217726077 +0000 UTC m=+15.726116238"
Apr 16 17:41:19.218250 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.218019 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7t8dz"]
Apr 16 17:41:19.222510 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.222480 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.222634 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:19.222553 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c"
Apr 16 17:41:19.306783 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.306753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-kubelet-config\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.306954 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.306794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.306954 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.306870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-dbus\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408255 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.408159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-kubelet-config\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408255 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.408211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408475 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.408256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-dbus\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408475 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.408293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-kubelet-config\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408475 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:19.408395 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:19.408475 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.408434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d66127b0-6df7-4368-bf73-d0b830421d6c-dbus\") pod \"global-pull-secret-syncer-7t8dz\"
(UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.408475 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:19.408457 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:19.908438954 +0000 UTC m=+16.416829307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:19.912178 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:19.912141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:19.912450 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:19.912282 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:19.912450 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:19.912376 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:20.912355121 +0000 UTC m=+17.420745263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:20.117139 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:20.117101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:20.117313 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:20.117240 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554"
Apr 16 17:41:20.117416 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:20.117320 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:20.117485 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:20.117434 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363"
Apr 16 17:41:20.918284 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:20.918229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:20.918668 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:20.918403 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:20.918668 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:20.918471 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:22.918457015 +0000 UTC m=+19.426847160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:41:21.117849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:21.117815 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:21.118019 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.117955 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c"
Apr 16 17:41:21.725789 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:21.725752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:21.725972 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.725910 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:21.726027 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.725986 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:37.72596794 +0000 UTC m=+34.234358085 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:21.826663 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:21.826624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:21.826838 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.826800 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:21.826952 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.826824 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:21.829142 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.827154 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cxtd5 for pod openshift-network-diagnostics/network-check-target-vwz6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:21.829142 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:21.827287 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5 podName:2436cc07-66d7-4793-9260-5c3585aae363 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:37.827258882 +0000 UTC m=+34.335649039 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxtd5" (UniqueName: "kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5") pod "network-check-target-vwz6h" (UID: "2436cc07-66d7-4793-9260-5c3585aae363") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:22.117826 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:22.117646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:22.117826 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:22.117671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:22.117826 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:22.117787 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:22.118361 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:22.117943 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:22.934195 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:22.934158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:22.934407 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:22.934358 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:22.934477 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:22.934441 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:26.934419515 +0000 UTC m=+23.442809707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:23.117917 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:23.117881 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:23.118299 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:23.117987 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:24.073524 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:24.073356 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf243dd2e_6d7f_4c1b_9ec7_346a02c79bba.slice/crio-conmon-a820596aa8005f2312f350f95d1c39157f51acb3f9acb0ab64374da037812511.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:41:24.117823 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.117793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:24.117946 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:24.117899 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:24.118500 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.117989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:24.118500 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:24.118131 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:24.221994 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.221968 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pv5jg" event={"ID":"8643560d-c751-40a2-a84e-fd9619f0a198","Type":"ContainerStarted","Data":"2e29e0a0c80def798b7c1f49c7b687822bd8baca731885853c329956cc349ee1"} Apr 16 17:41:24.223131 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.223110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2xwww" event={"ID":"3d913013-d26e-4756-9b14-4e6907f4baf0","Type":"ContainerStarted","Data":"e2b79ae9e0ae06aaf244c19288d188e1c56a1175ee83d0f968f7c9240526667a"} Apr 16 17:41:24.224272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.224254 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vrkbq" event={"ID":"2e01f328-7d13-47e4-ba26-d47919ca94fb","Type":"ContainerStarted","Data":"aab4e9f4ac4656fb349961922ab0f8e9cc8e041ea3d9e3fb5ca1e08efe0b1987"} Apr 16 17:41:24.225609 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.225590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"52549446b7b33a69c17592e0f4aaa6035067716aa710408f39ae7b1f2f34ea6f"} Apr 16 17:41:24.226715 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.226695 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" event={"ID":"f144295e-123d-49ad-96f0-a793fc10f2bd","Type":"ContainerStarted","Data":"cbbdf523fd28bae92eb86dd58e2bffa5a3f1017cbe6e6fe9c0b896f2598bcf18"} Apr 16 17:41:24.227896 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.227879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" 
event={"ID":"b278f078-c804-48cf-b77a-2509deb41cc2","Type":"ContainerStarted","Data":"4eb1ba7d172a6533ee1b801ef4e7300cf8b5a34276df21205b261cd2f7e31c3e"} Apr 16 17:41:24.228964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.228945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q4vbk" event={"ID":"6ed2c2e6-5851-4969-afa8-f8336c09ee54","Type":"ContainerStarted","Data":"e9a3dad7e6ae2beb940f6a72b1bbd26602f6a297f0fb619ded0770d1ca13c24b"} Apr 16 17:41:24.230175 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.230157 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="a820596aa8005f2312f350f95d1c39157f51acb3f9acb0ab64374da037812511" exitCode=0 Apr 16 17:41:24.230246 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.230183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"a820596aa8005f2312f350f95d1c39157f51acb3f9acb0ab64374da037812511"} Apr 16 17:41:24.236602 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.236568 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pv5jg" podStartSLOduration=3.352003088 podStartE2EDuration="20.236557416s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.717104061 +0000 UTC m=+3.225494202" lastFinishedPulling="2026-04-16 17:41:23.601658382 +0000 UTC m=+20.110048530" observedRunningTime="2026-04-16 17:41:24.235953471 +0000 UTC m=+20.744343634" watchObservedRunningTime="2026-04-16 17:41:24.236557416 +0000 UTC m=+20.744947578" Apr 16 17:41:24.248917 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.248879 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q4vbk" podStartSLOduration=3.327563207 podStartE2EDuration="20.248864928s" 
podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.693180175 +0000 UTC m=+3.201570315" lastFinishedPulling="2026-04-16 17:41:23.614481893 +0000 UTC m=+20.122872036" observedRunningTime="2026-04-16 17:41:24.248297592 +0000 UTC m=+20.756687755" watchObservedRunningTime="2026-04-16 17:41:24.248864928 +0000 UTC m=+20.757255088" Apr 16 17:41:24.260240 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.260198 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dfv6z" podStartSLOduration=3.425068246 podStartE2EDuration="20.260184771s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.717152612 +0000 UTC m=+3.225542757" lastFinishedPulling="2026-04-16 17:41:23.552269135 +0000 UTC m=+20.060659282" observedRunningTime="2026-04-16 17:41:24.259870615 +0000 UTC m=+20.768260776" watchObservedRunningTime="2026-04-16 17:41:24.260184771 +0000 UTC m=+20.768574932" Apr 16 17:41:24.272207 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.272168 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2xwww" podStartSLOduration=3.411212475 podStartE2EDuration="20.272154543s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.691324666 +0000 UTC m=+3.199714820" lastFinishedPulling="2026-04-16 17:41:23.552266737 +0000 UTC m=+20.060656888" observedRunningTime="2026-04-16 17:41:24.272035524 +0000 UTC m=+20.780425686" watchObservedRunningTime="2026-04-16 17:41:24.272154543 +0000 UTC m=+20.780544705" Apr 16 17:41:24.300619 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:24.300568 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vrkbq" podStartSLOduration=2.387156595 podStartE2EDuration="19.300548368s" podCreationTimestamp="2026-04-16 17:41:05 +0000 UTC" 
firstStartedPulling="2026-04-16 17:41:06.689681578 +0000 UTC m=+3.198071718" lastFinishedPulling="2026-04-16 17:41:23.60307335 +0000 UTC m=+20.111463491" observedRunningTime="2026-04-16 17:41:24.30008062 +0000 UTC m=+20.808470784" watchObservedRunningTime="2026-04-16 17:41:24.300548368 +0000 UTC m=+20.808938530" Apr 16 17:41:25.118507 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.118189 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:25.119030 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:25.118581 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:25.233869 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.233811 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:41:25.235427 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:41:25.235717 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235690 2577 generic.go:358] "Generic (PLEG): container finished" podID="2a931acd-9936-4d4e-a3b6-d2d86cb92da4" containerID="fcac339e5b627be6b789935ddb726e59548df7d39dd8a668e627ebb432d23747" exitCode=1 Apr 16 17:41:25.235796 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" 
event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"21906e6cb9bd07c7d70bafdd59d222705414526f412e360b493743f72a2519d2"} Apr 16 17:41:25.235831 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"14fbc9e9a33a9b370c2c8f5e2d89c6a11daeaa4ce4950b8046df1b23835c5854"} Apr 16 17:41:25.235831 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"355c92edc1ef953895f233c8b85f3d637bdda890eab40c267ef4b4952b514fec"} Apr 16 17:41:25.235906 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"9dbd30525357c9bd745a45393e00672a0ec623a6bae1f1e4fb4cbcf3a5158a33"} Apr 16 17:41:25.235906 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.235850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerDied","Data":"fcac339e5b627be6b789935ddb726e59548df7d39dd8a668e627ebb432d23747"} Apr 16 17:41:25.237629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.237608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" event={"ID":"b278f078-c804-48cf-b77a-2509deb41cc2","Type":"ContainerStarted","Data":"abad3d5c7367dfe5939a0a9f6346767678bd5cfd66a7742c8c438d3d2b426426"} Apr 16 17:41:25.239404 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.239382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-2krk7" event={"ID":"24061e65-3c69-48d6-8110-9c66fb64e102","Type":"ContainerStarted","Data":"ee89fd7cfa5cc55eaa2844172f35f50eded992b591240408159fdd9328138abc"} Apr 16 17:41:25.249840 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:25.249785 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2krk7" podStartSLOduration=4.370940416 podStartE2EDuration="21.249770802s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.722257921 +0000 UTC m=+3.230648061" lastFinishedPulling="2026-04-16 17:41:23.601088305 +0000 UTC m=+20.109478447" observedRunningTime="2026-04-16 17:41:25.249300828 +0000 UTC m=+21.757690990" watchObservedRunningTime="2026-04-16 17:41:25.249770802 +0000 UTC m=+21.758160964" Apr 16 17:41:26.061211 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.061100 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:41:25.233823483Z","UUID":"f63cbf00-eedf-4476-abe3-eae3c2974ab1","Handler":null,"Name":"","Endpoint":""} Apr 16 17:41:26.063226 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.063200 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:41:26.063226 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.063233 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:41:26.118559 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.118027 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:26.118559 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:26.118132 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:26.118559 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.118422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:26.118559 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:26.118499 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:26.969122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:26.969085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:26.969293 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:26.969250 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:26.969392 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:26.969346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:34.969309286 +0000 UTC m=+31.477699447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:27.117585 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:27.117546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:27.117735 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:27.117680 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:27.245131 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:27.245045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" event={"ID":"b278f078-c804-48cf-b77a-2509deb41cc2","Type":"ContainerStarted","Data":"a1b100dac3e1cd83d66c050230481a0fb13bcae19735165cd3d2a3afa31d0e78"} Apr 16 17:41:27.268995 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:27.268950 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qbj6z" podStartSLOduration=3.661091499 podStartE2EDuration="23.268935516s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.694847952 +0000 UTC m=+3.203238107" lastFinishedPulling="2026-04-16 17:41:26.302691977 +0000 UTC m=+22.811082124" observedRunningTime="2026-04-16 17:41:27.268551111 +0000 UTC m=+23.776941272" watchObservedRunningTime="2026-04-16 17:41:27.268935516 +0000 UTC m=+23.777325678" Apr 16 17:41:27.662461 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:27.662176 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:27.662879 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:27.662859 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:28.117385 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:28.117275 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:28.117556 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:28.117424 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:28.117556 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:28.117455 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:28.117556 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:28.117500 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:28.250078 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:28.250056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:41:29.117671 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:29.117638 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:29.117885 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:29.117743 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:29.254189 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:29.254156 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="3744e676cdcc32a34f4d2fcf1fbcc96062bcd4a470f38bffe1ab17f57a568b2e" exitCode=0 Apr 16 17:41:29.254848 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:29.254241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"3744e676cdcc32a34f4d2fcf1fbcc96062bcd4a470f38bffe1ab17f57a568b2e"} Apr 16 17:41:29.256964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:29.256917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:41:29.257381 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:29.257364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"020aeba9af77972fa788eb56bc65f4418238444011be2d77204806dd59a6e275"} Apr 16 17:41:30.117998 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:30.117961 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:30.118131 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:30.117965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:30.118131 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:30.118072 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:30.118225 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:30.118168 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:30.261113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:30.261030 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="32f48be854a76acc84edefd40cb726e069011662c7e748f303fdfba545b68d43" exitCode=0 Apr 16 17:41:30.261113 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:30.261091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"32f48be854a76acc84edefd40cb726e069011662c7e748f303fdfba545b68d43"} Apr 16 17:41:31.117886 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.117643 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:31.118029 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:31.117911 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:31.266705 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.266671 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="18ef819e34213c13559d3693431296fbebccd6aa4f31fcc7b56b6fd9a00b61ea" exitCode=0 Apr 16 17:41:31.267225 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.266733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"18ef819e34213c13559d3693431296fbebccd6aa4f31fcc7b56b6fd9a00b61ea"} Apr 16 17:41:31.269951 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.269936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:41:31.270308 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.270281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"f47d27cc6a1932aca65dc953701e32f5bb2a55283b8fd6a64f55949ea85906e5"} Apr 16 17:41:31.270541 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.270523 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:31.270604 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.270555 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:31.270752 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:31.270733 2577 scope.go:117] "RemoveContainer" containerID="fcac339e5b627be6b789935ddb726e59548df7d39dd8a668e627ebb432d23747" Apr 16 17:41:31.286231 ip-10-0-143-234 kubenswrapper[2577]: I0416 
17:41:31.286209 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:32.120996 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.120963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:32.121178 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.120962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:32.121178 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:32.121102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:32.121178 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:32.121156 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:32.276310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.276282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:41:32.276769 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.276640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" event={"ID":"2a931acd-9936-4d4e-a3b6-d2d86cb92da4","Type":"ContainerStarted","Data":"27a3b9551b3d2f01f4424dabde404b88ea08b90ff8ff34aaddc6b94a9f6d8715"} Apr 16 17:41:32.277108 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.277083 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:32.295225 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.295111 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:41:32.310182 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.310125 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" podStartSLOduration=10.94304073 podStartE2EDuration="28.310108805s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.717113054 +0000 UTC m=+3.225503200" lastFinishedPulling="2026-04-16 17:41:24.084180943 +0000 UTC m=+20.592571275" observedRunningTime="2026-04-16 17:41:32.309962325 +0000 UTC m=+28.818352499" watchObservedRunningTime="2026-04-16 17:41:32.310108805 +0000 UTC m=+28.818498966" Apr 16 17:41:32.772558 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.772527 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7t8dz"] Apr 16 17:41:32.772740 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:32.772671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:32.772801 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:32.772775 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:32.775603 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.775579 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gg8gs"] Apr 16 17:41:32.775758 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.775744 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:32.775886 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:32.775865 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:32.788654 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.788620 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vwz6h"] Apr 16 17:41:32.788820 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:32.788732 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:32.789455 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:32.788832 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:34.123288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:34.122782 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:34.123288 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:34.122897 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:34.877399 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:34.876960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:34.877399 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:34.877105 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 17:41:34.877827 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:34.877651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2xwww" Apr 16 17:41:35.034565 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:35.034379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:35.034716 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:35.034520 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:35.034716 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:35.034682 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret podName:d66127b0-6df7-4368-bf73-d0b830421d6c nodeName:}" failed. No retries permitted until 2026-04-16 17:41:51.03466383 +0000 UTC m=+47.543053991 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret") pod "global-pull-secret-syncer-7t8dz" (UID: "d66127b0-6df7-4368-bf73-d0b830421d6c") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:41:35.117899 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:35.117813 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:35.118057 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:35.117813 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:41:35.118057 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:35.117947 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7t8dz" podUID="d66127b0-6df7-4368-bf73-d0b830421d6c" Apr 16 17:41:35.118147 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:35.118062 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vwz6h" podUID="2436cc07-66d7-4793-9260-5c3585aae363" Apr 16 17:41:36.117778 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.117743 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:41:36.118222 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:36.117898 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:41:36.786219 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.786190 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-234.ec2.internal" event="NodeReady" Apr 16 17:41:36.786437 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.786341 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 17:41:36.817857 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.817825 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-767977f9c4-tqv6c"] Apr 16 17:41:36.834836 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.834809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-767977f9c4-tqv6c"] Apr 16 17:41:36.835003 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.834940 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.837302 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.836729 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfj4"] Apr 16 17:41:36.837302 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.837027 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 17:41:36.837302 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.837246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 17:41:36.837607 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.837575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vf8ds\"" Apr 16 17:41:36.837989 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.837785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 17:41:36.843613 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.843519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 17:41:36.859566 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.859535 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6l87f"] Apr 16 17:41:36.859715 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.859692 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:36.861671 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.861636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 17:41:36.861671 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.861641 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 17:41:36.861883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.861651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w8phs\"" Apr 16 17:41:36.861883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.861778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 17:41:36.880709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.880657 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"] Apr 16 17:41:36.880858 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.880842 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:36.882849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.882829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 17:41:36.882849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.882837 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2vlxb\"" Apr 16 17:41:36.883120 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.883107 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 17:41:36.902992 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.902963 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"] Apr 16 17:41:36.903110 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.903033 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:41:36.904982 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.904962 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 17:41:36.905079 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.905024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bfzrx\"" Apr 16 17:41:36.905123 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.905093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 17:41:36.921035 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.921013 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfj4"] Apr 16 17:41:36.921035 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.921036 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"] Apr 16 17:41:36.921035 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.921045 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"] Apr 16 17:41:36.921035 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.921053 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6l87f"] Apr 16 17:41:36.921283 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.921164 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94" Apr 16 17:41:36.923129 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.923109 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9w77h\"" Apr 16 17:41:36.923245 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.923135 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:41:36.923245 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.923115 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:41:36.947995 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.947962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948185 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxf6\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948185 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted\") pod 
\"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948185 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948185 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnc8n\" (UniqueName: \"kubernetes.io/projected/573f0e79-0a24-47b1-9570-15a67f037365-kube-api-access-vnc8n\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:36.948430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:36.948430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948278 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:36.948430 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:36.948387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.049504 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0738358-399f-4f84-8552-0728eba20372-config-volume\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:37.049574 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxf6\" (UniqueName: 
\"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.049574 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bcl8\" (UniqueName: \"kubernetes.io/projected/f5699995-82fb-44e3-a47d-70164f1e97cd-kube-api-access-2bcl8\") pod \"network-check-source-7b678d77c7-zll94\" (UID: \"f5699995-82fb-44e3-a47d-70164f1e97cd\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94" Apr 16 17:41:37.049653 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.049816 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.049816 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049707 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:41:37.049816 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:41:37.049816 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnc8n\" (UniqueName: \"kubernetes.io/projected/573f0e79-0a24-47b1-9570-15a67f037365-kube-api-access-vnc8n\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.049973 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.050005 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.049984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.050272 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.050033 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:37.550014526 +0000 UTC m=+34.058404666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:41:37.050272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.050272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:37.050272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0738358-399f-4f84-8552-0728eba20372-tmp-dir\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:37.050272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbq2n\" (UniqueName: \"kubernetes.io/projected/d0738358-399f-4f84-8552-0728eba20372-kube-api-access-kbq2n\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:37.050272 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:37.050185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.050570 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.050389 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:41:37.050570 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.050403 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found
Apr 16 17:41:37.050570 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.050459 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:37.550441211 +0000 UTC m=+34.058831371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found
Apr 16 17:41:37.050570 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.050740 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.050626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.054126 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.054105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.054232 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.054131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.059895 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.059872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxf6\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.060755 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.060725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.061181 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.061160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnc8n\" (UniqueName: \"kubernetes.io/projected/573f0e79-0a24-47b1-9570-15a67f037365-kube-api-access-vnc8n\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:41:37.118127 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.118099 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:37.118688 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.118131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz"
Apr 16 17:41:37.120250 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.120233 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qnscp\""
Apr 16 17:41:37.120351 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.120233 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 17:41:37.151202 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq2n\" (UniqueName: \"kubernetes.io/projected/d0738358-399f-4f84-8552-0728eba20372-kube-api-access-kbq2n\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.151368 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0738358-399f-4f84-8552-0728eba20372-config-volume\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.151368 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bcl8\" (UniqueName: \"kubernetes.io/projected/f5699995-82fb-44e3-a47d-70164f1e97cd-kube-api-access-2bcl8\") pod \"network-check-source-7b678d77c7-zll94\" (UID: \"f5699995-82fb-44e3-a47d-70164f1e97cd\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"
Apr 16 17:41:37.151368 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:37.151368 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:37.151579 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.151579 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0738358-399f-4f84-8552-0728eba20372-tmp-dir\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.151579 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.151561 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:41:37.151723 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.151606 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:41:37.151723 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.151628 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:37.65160928 +0000 UTC m=+34.159999424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found
Apr 16 17:41:37.151723 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.151659 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:37.651642379 +0000 UTC m=+34.160032524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found
Apr 16 17:41:37.151871 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0738358-399f-4f84-8552-0728eba20372-tmp-dir\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.151871 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.151857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0738358-399f-4f84-8552-0728eba20372-config-volume\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.152056 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.152033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:37.159292 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.159271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq2n\" (UniqueName: \"kubernetes.io/projected/d0738358-399f-4f84-8552-0728eba20372-kube-api-access-kbq2n\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.159517 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.159496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bcl8\" (UniqueName: \"kubernetes.io/projected/f5699995-82fb-44e3-a47d-70164f1e97cd-kube-api-access-2bcl8\") pod \"network-check-source-7b678d77c7-zll94\" (UID: \"f5699995-82fb-44e3-a47d-70164f1e97cd\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"
Apr 16 17:41:37.229370 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.229317 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"
Apr 16 17:41:37.288779 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.288687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerStarted","Data":"15b56288a0ef2a8ecf52e2680fb48c537ea83813c125dc653ed0ac5b22edbc8e"}
Apr 16 17:41:37.385385 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.385112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zll94"]
Apr 16 17:41:37.389453 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:37.389312 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5699995_82fb_44e3_a47d_70164f1e97cd.slice/crio-f8bdc039f815237496b12df36b7d62fa0f44f44c808a7cba00ccb4c0cb789b24 WatchSource:0}: Error finding container f8bdc039f815237496b12df36b7d62fa0f44f44c808a7cba00ccb4c0cb789b24: Status 404 returned error can't find the container with id f8bdc039f815237496b12df36b7d62fa0f44f44c808a7cba00ccb4c0cb789b24
Apr 16 17:41:37.554510 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.554410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:41:37.554510 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.554488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:37.554689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.554563 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:41:37.554689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.554602 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:41:37.554689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.554613 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found
Apr 16 17:41:37.554689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.554633 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:38.554613125 +0000 UTC m=+35.063003284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found
Apr 16 17:41:37.554689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.554647 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:38.554638109 +0000 UTC m=+35.063028269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found
Apr 16 17:41:37.655349 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.655299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:37.655523 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.655365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:37.655523 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.655438 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:41:37.655523 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.655453 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:41:37.655523 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.655500 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:38.655486734 +0000 UTC m=+35.163876874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found
Apr 16 17:41:37.655523 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.655512 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:38.655506771 +0000 UTC m=+35.163896911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found
Apr 16 17:41:37.755999 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.755968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:37.756238 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.756122 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:37.756238 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:37.756185 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:09.756171056 +0000 UTC m=+66.264561217 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:41:37.857180 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.857137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:37.861060 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:37.861035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxtd5\" (UniqueName: \"kubernetes.io/projected/2436cc07-66d7-4793-9260-5c3585aae363-kube-api-access-cxtd5\") pod \"network-check-target-vwz6h\" (UID: \"2436cc07-66d7-4793-9260-5c3585aae363\") " pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:38.037499 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.037450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:38.121414 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.121326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:41:38.123994 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.123969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 17:41:38.124124 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.124079 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8xx6l\""
Apr 16 17:41:38.175193 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.175165 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vwz6h"]
Apr 16 17:41:38.188498 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:38.188438 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2436cc07_66d7_4793_9260_5c3585aae363.slice/crio-abe5aff1ee085956e8ee8355bf20705f81d52c206ef85589c2cfc80fb0a79403 WatchSource:0}: Error finding container abe5aff1ee085956e8ee8355bf20705f81d52c206ef85589c2cfc80fb0a79403: Status 404 returned error can't find the container with id abe5aff1ee085956e8ee8355bf20705f81d52c206ef85589c2cfc80fb0a79403
Apr 16 17:41:38.293077 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.293041 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="15b56288a0ef2a8ecf52e2680fb48c537ea83813c125dc653ed0ac5b22edbc8e" exitCode=0
Apr 16 17:41:38.293253 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.293119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"15b56288a0ef2a8ecf52e2680fb48c537ea83813c125dc653ed0ac5b22edbc8e"}
Apr 16 17:41:38.294490 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.294230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vwz6h" event={"ID":"2436cc07-66d7-4793-9260-5c3585aae363","Type":"ContainerStarted","Data":"abe5aff1ee085956e8ee8355bf20705f81d52c206ef85589c2cfc80fb0a79403"}
Apr 16 17:41:38.295361 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.295321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94" event={"ID":"f5699995-82fb-44e3-a47d-70164f1e97cd","Type":"ContainerStarted","Data":"f8bdc039f815237496b12df36b7d62fa0f44f44c808a7cba00ccb4c0cb789b24"}
Apr 16 17:41:38.564008 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.563971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:38.564173 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.564079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:41:38.564173 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.564156 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:41:38.564284 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.564198 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found
Apr 16 17:41:38.564284 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.564222 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:41:38.564284 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.564270 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:40.564250301 +0000 UTC m=+37.072640453 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found
Apr 16 17:41:38.564461 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.564301 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:40.564283988 +0000 UTC m=+37.072674158 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found
Apr 16 17:41:38.664874 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.664780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:38.665046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:38.664914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:38.665046 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.664985 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:41:38.665046 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.665000 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:41:38.665046 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.665045 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:40.665028445 +0000 UTC m=+37.173418591 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found
Apr 16 17:41:38.665234 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:38.665063 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:40.665053735 +0000 UTC m=+37.173443875 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found
Apr 16 17:41:39.301090 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:39.301054 2577 generic.go:358] "Generic (PLEG): container finished" podID="f243dd2e-6d7f-4c1b-9ec7-346a02c79bba" containerID="31dd07a22d00389bd85804ea1ab9b8c20410fa5d09e09398880acf970c2687a9" exitCode=0
Apr 16 17:41:39.301602 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:39.301131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerDied","Data":"31dd07a22d00389bd85804ea1ab9b8c20410fa5d09e09398880acf970c2687a9"}
Apr 16 17:41:40.582180 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:40.581952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.582110 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:40.582258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.582277 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.582358 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:44.58231848 +0000 UTC m=+41.090708636 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.582427 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:41:40.582574 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.582483 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:44.582469736 +0000 UTC m=+41.090859877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found
Apr 16 17:41:40.682948 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:40.682928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:41:40.683064 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:40.682998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:41:40.683099 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.683087 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:41:40.683133 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.683094 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:41:40.683165 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.683134 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:44.683119041 +0000 UTC m=+41.191509182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found
Apr 16 17:41:40.683165 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:40.683147 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:44.683141356 +0000 UTC m=+41.191531496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found
Apr 16 17:41:41.307839 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.307801 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s72ln" event={"ID":"f243dd2e-6d7f-4c1b-9ec7-346a02c79bba","Type":"ContainerStarted","Data":"e7b6e3f13651cc918550f3014a9488268e99e7cd5fa9da9bf07a0b4b94f77520"}
Apr 16 17:41:41.309086 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.309057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vwz6h" event={"ID":"2436cc07-66d7-4793-9260-5c3585aae363","Type":"ContainerStarted","Data":"a422e33d67aabdb54ba77c0486708f78792c6e095b24e0a6449623b1b1c9c059"}
Apr 16 17:41:41.309177 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.309138 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vwz6h"
Apr 16 17:41:41.310279 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.310247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94" event={"ID":"f5699995-82fb-44e3-a47d-70164f1e97cd","Type":"ContainerStarted","Data":"12a17b1fd6f6ffb191bb0db9b406ee852b24face0bbb3822f85d078ba8733810"}
Apr 16 17:41:41.330544 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.330498 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s72ln" podStartSLOduration=7.023221775 podStartE2EDuration="37.330483524s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.722500331 +0000 UTC m=+3.230890485" lastFinishedPulling="2026-04-16 17:41:37.029762077 +0000 UTC m=+33.538152234" observedRunningTime="2026-04-16 17:41:41.329649942 +0000 UTC m=+37.838040104" watchObservedRunningTime="2026-04-16 17:41:41.330483524 +0000 UTC m=+37.838873686"
Apr 16 17:41:41.344176 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.344129 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vwz6h" podStartSLOduration=34.854171122 podStartE2EDuration="37.344117433s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:38.189756458 +0000 UTC m=+34.698146599" lastFinishedPulling="2026-04-16 17:41:40.679702769 +0000 UTC m=+37.188092910" observedRunningTime="2026-04-16 17:41:41.343362724 +0000 UTC m=+37.851752904" watchObservedRunningTime="2026-04-16 17:41:41.344117433 +0000 UTC m=+37.852507594"
Apr 16 17:41:41.358673 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:41.358613 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zll94" podStartSLOduration=34.080451117 podStartE2EDuration="37.358599552s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:37.391284248 +0000 UTC m=+33.899674388" lastFinishedPulling="2026-04-16 17:41:40.669432669 +0000 UTC m=+37.177822823" observedRunningTime="2026-04-16 17:41:41.357668292 +0000 UTC m=+37.866058454" watchObservedRunningTime="2026-04-16 17:41:41.358599552 +0000 UTC m=+37.866989714"
Apr 16 17:41:44.616052 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:44.616011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:41:44.616612 ip-10-0-143-234
kubenswrapper[2577]: I0416 17:41:44.616098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:44.616612 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.616166 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:41:44.616612 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.616189 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found Apr 16 17:41:44.616612 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.616190 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:41:44.616612 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.616239 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:52.616227183 +0000 UTC m=+49.124617323 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:41:44.616612 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.616251 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:52.616245618 +0000 UTC m=+49.124635758 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found Apr 16 17:41:44.717446 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:44.717401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:41:44.717569 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:44.717464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:44.717569 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.717551 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:41:44.717569 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.717552 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:41:44.717656 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.717598 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:52.717587632 +0000 UTC m=+49.225977772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found Apr 16 17:41:44.717656 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:44.717610 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:52.717604854 +0000 UTC m=+49.225994994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found Apr 16 17:41:51.061874 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:51.061837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:51.065347 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:51.065304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d66127b0-6df7-4368-bf73-d0b830421d6c-original-pull-secret\") pod \"global-pull-secret-syncer-7t8dz\" (UID: \"d66127b0-6df7-4368-bf73-d0b830421d6c\") " pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:51.254499 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:41:51.254462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7t8dz" Apr 16 17:41:51.389750 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:51.389714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7t8dz"] Apr 16 17:41:51.393420 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:41:51.393394 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66127b0_6df7_4368_bf73_d0b830421d6c.slice/crio-a72250922ee7e472ed4aecccd8996e8d5f39b7d13149cfe89e455623057356d0 WatchSource:0}: Error finding container a72250922ee7e472ed4aecccd8996e8d5f39b7d13149cfe89e455623057356d0: Status 404 returned error can't find the container with id a72250922ee7e472ed4aecccd8996e8d5f39b7d13149cfe89e455623057356d0 Apr 16 17:41:52.333054 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:52.333014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7t8dz" event={"ID":"d66127b0-6df7-4368-bf73-d0b830421d6c","Type":"ContainerStarted","Data":"a72250922ee7e472ed4aecccd8996e8d5f39b7d13149cfe89e455623057356d0"} Apr 16 17:41:52.675398 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:52.675299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:41:52.675548 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:52.675412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: 
\"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:41:52.675548 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.675450 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:41:52.675548 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.675473 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found Apr 16 17:41:52.675548 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.675530 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:08.675515344 +0000 UTC m=+65.183905484 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found Apr 16 17:41:52.675548 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.675528 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:41:52.675753 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.675574 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:08.675558752 +0000 UTC m=+65.183948892 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:41:52.776287 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:52.776248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:41:52.776501 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:52.776318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:41:52.776501 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.776430 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:41:52.776603 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.776509 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:08.776489546 +0000 UTC m=+65.284879686 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found Apr 16 17:41:52.776603 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.776436 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:41:52.776603 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:41:52.776572 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:08.776558852 +0000 UTC m=+65.284948992 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found Apr 16 17:41:56.343544 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:56.343504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7t8dz" event={"ID":"d66127b0-6df7-4368-bf73-d0b830421d6c","Type":"ContainerStarted","Data":"375124e293cb0242154c8d36594df6c887a6283ad924a6e5144350c8c3f24d20"} Apr 16 17:41:56.359041 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:41:56.358982 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7t8dz" podStartSLOduration=33.564657382 podStartE2EDuration="37.35896543s" podCreationTimestamp="2026-04-16 17:41:19 +0000 UTC" firstStartedPulling="2026-04-16 17:41:51.395556813 +0000 UTC m=+47.903946957" lastFinishedPulling="2026-04-16 17:41:55.189864851 +0000 UTC 
m=+51.698255005" observedRunningTime="2026-04-16 17:41:56.358781403 +0000 UTC m=+52.867171565" watchObservedRunningTime="2026-04-16 17:41:56.35896543 +0000 UTC m=+52.867355593" Apr 16 17:42:04.291341 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:04.291305 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g5src" Apr 16 17:42:07.725135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.725097 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m"] Apr 16 17:42:07.761295 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.761258 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m"] Apr 16 17:42:07.761463 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.761400 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.763980 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.763959 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 17:42:07.764106 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.763983 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 17:42:07.764106 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.764036 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 17:42:07.764324 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.764303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 
17:42:07.764455 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.764341 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 17:42:07.764455 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.764356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 17:42:07.764455 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.764362 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 17:42:07.788222 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.788371 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.788371 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mspv\" (UniqueName: 
\"kubernetes.io/projected/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-kube-api-access-7mspv\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.788371 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.788371 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.788507 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.788390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889438 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889438 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889669 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889669 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889669 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.889669 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.889582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mspv\" (UniqueName: \"kubernetes.io/projected/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-kube-api-access-7mspv\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.892046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.892014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.892178 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.892159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.892235 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.892209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.892295 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.892281 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.897390 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.897368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mspv\" (UniqueName: \"kubernetes.io/projected/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-kube-api-access-7mspv\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:07.900354 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:07.900323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5546c7f497-k2f7m\" (UID: \"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:08.082511 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.082479 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" Apr 16 17:42:08.204188 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.204158 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m"] Apr 16 17:42:08.207350 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:42:08.207306 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf73ecf_b5ca_400b_b50d_cbfc375fe2fc.slice/crio-e31bb320b4244480b2d73d6d93eec8782ed4b36773ebfbfdeac483f5ef44f975 WatchSource:0}: Error finding container e31bb320b4244480b2d73d6d93eec8782ed4b36773ebfbfdeac483f5ef44f975: Status 404 returned error can't find the container with id e31bb320b4244480b2d73d6d93eec8782ed4b36773ebfbfdeac483f5ef44f975 Apr 16 17:42:08.368346 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.368242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerStarted","Data":"e31bb320b4244480b2d73d6d93eec8782ed4b36773ebfbfdeac483f5ef44f975"} Apr 16 17:42:08.695666 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.695579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:42:08.695833 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.695685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: 
\"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:42:08.695833 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.695818 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:42:08.695942 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.695879 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:40.695859804 +0000 UTC m=+97.204249948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:42:08.696010 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.695953 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:42:08.696010 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.695965 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found Apr 16 17:42:08.696010 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.695999 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:40.695987944 +0000 UTC m=+97.204378086 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found Apr 16 17:42:08.796278 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.796237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:42:08.796765 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:08.796317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:42:08.796765 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.796398 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:42:08.796765 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.796447 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:42:08.796765 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.796475 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:40.79645382 +0000 UTC m=+97.304843973 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found Apr 16 17:42:08.796765 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:08.796493 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:42:40.79648185 +0000 UTC m=+97.304871990 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found Apr 16 17:42:09.805727 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:09.805687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:42:09.807973 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:09.807949 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:42:09.816543 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:09.816513 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 17:42:09.816689 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:09.816587 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:13.8165701 +0000 UTC m=+130.324960241 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : secret "metrics-daemon-secret" not found Apr 16 17:42:11.376093 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:11.376051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerStarted","Data":"cdf19a6efc5adbd69e7470aca16889e938f10409bf6fcd88e80acbb9fa95533b"} Apr 16 17:42:12.314832 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:12.314805 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vwz6h" Apr 16 17:42:12.380470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:12.380437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerStarted","Data":"2f229045f2fc0b984d1877d12679008892a8e62c7c039ab15a9dc0c1a1cf76be"} Apr 16 17:42:12.380470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:12.380473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerStarted","Data":"bbc91a68290234cb33f3ee251a735b9ad9e9836876f866b905df7bd1233fb6bb"} Apr 16 17:42:12.399533 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:12.399480 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" podStartSLOduration=1.381411983 podStartE2EDuration="5.399464879s" podCreationTimestamp="2026-04-16 17:42:07 +0000 UTC" firstStartedPulling="2026-04-16 17:42:08.209068253 +0000 UTC m=+64.717458394" lastFinishedPulling="2026-04-16 17:42:12.227121137 +0000 UTC m=+68.735511290" observedRunningTime="2026-04-16 17:42:12.398202042 +0000 UTC m=+68.906592206" watchObservedRunningTime="2026-04-16 17:42:12.399464879 +0000 UTC m=+68.907855041" Apr 16 17:42:40.742156 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:40.742114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:40.742189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.742277 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.742302 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.742276 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.742403 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:44.742371499 +0000 UTC m=+161.250761639 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:42:40.742707 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.742422 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:44.742415805 +0000 UTC m=+161.250805945 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found Apr 16 17:42:40.843192 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:40.843152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:42:40.843374 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:42:40.843204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:42:40.843374 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.843292 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:42:40.843374 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.843299 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:42:40.843374 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.843360 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:44.843346207 +0000 UTC m=+161.351736360 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found Apr 16 17:42:40.843510 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:42:40.843382 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:44.84336662 +0000 UTC m=+161.351756760 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found Apr 16 17:43:13.882119 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:13.882078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs" Apr 16 17:43:13.882704 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:13.882233 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 17:43:13.882704 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:13.882325 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs podName:eccdd8a8-ee59-4c3c-852e-f012ce698554 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:45:15.882303198 +0000 UTC m=+252.390693354 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs") pod "network-metrics-daemon-gg8gs" (UID: "eccdd8a8-ee59-4c3c-852e-f012ce698554") : secret "metrics-daemon-secret" not found Apr 16 17:43:33.163065 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:33.163036 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vrkbq_2e01f328-7d13-47e4-ba26-d47919ca94fb/dns-node-resolver/0.log" Apr 16 17:43:33.958011 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:33.957982 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pv5jg_8643560d-c751-40a2-a84e-fd9619f0a198/node-ca/0.log" Apr 16 17:43:39.853642 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:39.853598 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" podUID="1b98fec4-6489-4373-b88e-c49c1e82c443" Apr 16 17:43:39.868869 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:39.868844 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zgfj4" podUID="573f0e79-0a24-47b1-9570-15a67f037365" Apr 16 17:43:39.895106 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:39.895073 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6l87f" podUID="d0738358-399f-4f84-8552-0728eba20372" Apr 16 17:43:39.911253 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:39.911223 
2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" podUID="ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1" Apr 16 17:43:40.581712 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:40.581679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:43:40.581881 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:40.581682 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6l87f" Apr 16 17:43:40.581881 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:40.581683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:43:41.133211 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:41.133164 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gg8gs" podUID="eccdd8a8-ee59-4c3c-852e-f012ce698554" Apr 16 17:43:44.408411 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.408375 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5p4pc"] Apr 16 17:43:44.411446 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.411423 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.414082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.414056 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 17:43:44.414658 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.414634 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 17:43:44.415048 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.415032 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-48zkm\"" Apr 16 17:43:44.415408 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.415392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 17:43:44.415620 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.415601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 17:43:44.423920 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.423896 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5p4pc"] Apr 16 17:43:44.496717 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.496679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2v5p\" (UniqueName: \"kubernetes.io/projected/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-api-access-t2v5p\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.496884 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.496743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.496924 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.496880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aeebceb2-b35a-4208-ae0d-f95a63aa4920-crio-socket\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.496962 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.496923 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aeebceb2-b35a-4208-ae0d-f95a63aa4920-data-volume\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.496962 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.496941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.597855 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.597817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aeebceb2-b35a-4208-ae0d-f95a63aa4920-crio-socket\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " 
pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.597855 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.597854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aeebceb2-b35a-4208-ae0d-f95a63aa4920-data-volume\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.597870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.597893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2v5p\" (UniqueName: \"kubernetes.io/projected/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-api-access-t2v5p\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.597954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aeebceb2-b35a-4208-ae0d-f95a63aa4920-crio-socket\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.598034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598228 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.598147 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:43:44.598228 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.598175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aeebceb2-b35a-4208-ae0d-f95a63aa4920-data-volume\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.598228 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.598200 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls podName:aeebceb2-b35a-4208-ae0d-f95a63aa4920 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:45.098186754 +0000 UTC m=+161.606576897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5p4pc" (UID: "aeebceb2-b35a-4208-ae0d-f95a63aa4920") : secret "insights-runtime-extractor-tls" not found Apr 16 17:43:44.598370 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.598320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.606910 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.606891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2v5p\" (UniqueName: \"kubernetes.io/projected/aeebceb2-b35a-4208-ae0d-f95a63aa4920-kube-api-access-t2v5p\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc" Apr 16 17:43:44.799924 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.799827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4" Apr 16 17:43:44.799924 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.799886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") pod \"image-registry-767977f9c4-tqv6c\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 
17:43:44.800126 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.799976 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:43:44.800126 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.799992 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:43:44.800126 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.800002 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-767977f9c4-tqv6c: secret "image-registry-tls" not found Apr 16 17:43:44.800126 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.800049 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls podName:1b98fec4-6489-4373-b88e-c49c1e82c443 nodeName:}" failed. No retries permitted until 2026-04-16 17:45:46.800036445 +0000 UTC m=+283.308426586 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls") pod "image-registry-767977f9c4-tqv6c" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443") : secret "image-registry-tls" not found Apr 16 17:43:44.800126 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.800061 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert podName:573f0e79-0a24-47b1-9570-15a67f037365 nodeName:}" failed. No retries permitted until 2026-04-16 17:45:46.800055309 +0000 UTC m=+283.308445449 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert") pod "ingress-canary-zgfj4" (UID: "573f0e79-0a24-47b1-9570-15a67f037365") : secret "canary-serving-cert" not found Apr 16 17:43:44.900520 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.900489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f" Apr 16 17:43:44.900673 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:44.900552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" Apr 16 17:43:44.900673 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.900641 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:43:44.900673 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.900653 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:43:44.900791 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.900691 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert podName:ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1 nodeName:}" failed. No retries permitted until 2026-04-16 17:45:46.90067829 +0000 UTC m=+283.409068431 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-46ktl" (UID: "ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1") : secret "networking-console-plugin-cert" not found
Apr 16 17:43:44.900791 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:44.900715 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls podName:d0738358-399f-4f84-8552-0728eba20372 nodeName:}" failed. No retries permitted until 2026-04-16 17:45:46.900697306 +0000 UTC m=+283.409087451 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls") pod "dns-default-6l87f" (UID: "d0738358-399f-4f84-8552-0728eba20372") : secret "dns-default-metrics-tls" not found
Apr 16 17:43:45.102176 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:45.102136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:43:45.102365 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:45.102282 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:45.102430 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:45.102366 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls podName:aeebceb2-b35a-4208-ae0d-f95a63aa4920 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:46.102351156 +0000 UTC m=+162.610741317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5p4pc" (UID: "aeebceb2-b35a-4208-ae0d-f95a63aa4920") : secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:46.111003 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:46.110946 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:43:46.111414 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:46.111105 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:46.111414 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:46.111172 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls podName:aeebceb2-b35a-4208-ae0d-f95a63aa4920 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:48.11115702 +0000 UTC m=+164.619547161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5p4pc" (UID: "aeebceb2-b35a-4208-ae0d-f95a63aa4920") : secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:48.126077 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:48.126044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:43:48.126475 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:48.126175 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:48.126475 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:48.126230 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls podName:aeebceb2-b35a-4208-ae0d-f95a63aa4920 nodeName:}" failed. No retries permitted until 2026-04-16 17:43:52.126215856 +0000 UTC m=+168.634605996 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5p4pc" (UID: "aeebceb2-b35a-4208-ae0d-f95a63aa4920") : secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:52.161203 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:52.161160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:43:52.161631 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:52.161308 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:52.161631 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:43:52.161406 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls podName:aeebceb2-b35a-4208-ae0d-f95a63aa4920 nodeName:}" failed. No retries permitted until 2026-04-16 17:44:00.16138925 +0000 UTC m=+176.669779390 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5p4pc" (UID: "aeebceb2-b35a-4208-ae0d-f95a63aa4920") : secret "insights-runtime-extractor-tls" not found
Apr 16 17:43:54.118510 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:54.118478 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:43:55.117607 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:43:55.117554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:44:00.226095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.226063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:44:00.228503 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.228484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aeebceb2-b35a-4208-ae0d-f95a63aa4920-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5p4pc\" (UID: \"aeebceb2-b35a-4208-ae0d-f95a63aa4920\") " pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:44:00.323713 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.323662 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5p4pc"
Apr 16 17:44:00.440028 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.439999 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5p4pc"]
Apr 16 17:44:00.443585 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:44:00.443555 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeebceb2_b35a_4208_ae0d_f95a63aa4920.slice/crio-8f7283a1f61309b0608c185d304271e3d313a8e9b0ef7c3ebbcfdcef96027e91 WatchSource:0}: Error finding container 8f7283a1f61309b0608c185d304271e3d313a8e9b0ef7c3ebbcfdcef96027e91: Status 404 returned error can't find the container with id 8f7283a1f61309b0608c185d304271e3d313a8e9b0ef7c3ebbcfdcef96027e91
Apr 16 17:44:00.627881 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.627829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5p4pc" event={"ID":"aeebceb2-b35a-4208-ae0d-f95a63aa4920","Type":"ContainerStarted","Data":"bb7b4ee19d2a0c34f328a3bff4b4aa54e1b9e9bdce42582d86f36e1c381b9376"}
Apr 16 17:44:00.627881 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:00.627880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5p4pc" event={"ID":"aeebceb2-b35a-4208-ae0d-f95a63aa4920","Type":"ContainerStarted","Data":"8f7283a1f61309b0608c185d304271e3d313a8e9b0ef7c3ebbcfdcef96027e91"}
Apr 16 17:44:01.632370 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:01.632313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5p4pc" event={"ID":"aeebceb2-b35a-4208-ae0d-f95a63aa4920","Type":"ContainerStarted","Data":"274b8f5439e33f110a638528accc6b57380b2bdd94592753933f6c20b8891039"}
Apr 16 17:44:02.636739 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:02.636704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5p4pc" event={"ID":"aeebceb2-b35a-4208-ae0d-f95a63aa4920","Type":"ContainerStarted","Data":"94bdf2e5694ebca19499db2aa05cd90fe4024737c73c3eaf7db3055a63edf18b"}
Apr 16 17:44:02.667707 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:02.667656 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5p4pc" podStartSLOduration=16.665332694 podStartE2EDuration="18.667638209s" podCreationTimestamp="2026-04-16 17:43:44 +0000 UTC" firstStartedPulling="2026-04-16 17:44:00.499871747 +0000 UTC m=+177.008261887" lastFinishedPulling="2026-04-16 17:44:02.502177248 +0000 UTC m=+179.010567402" observedRunningTime="2026-04-16 17:44:02.667479146 +0000 UTC m=+179.175869309" watchObservedRunningTime="2026-04-16 17:44:02.667638209 +0000 UTC m=+179.176028372"
Apr 16 17:44:12.440035 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.439997 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c9jqz"]
Apr 16 17:44:12.443236 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.443220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.445611 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.445589 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gp5hh\""
Apr 16 17:44:12.445773 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.445591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 17:44:12.445773 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.445724 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 17:44:12.446230 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.446216 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 17:44:12.446230 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.446223 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:44:12.446309 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.446241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 17:44:12.446539 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.446526 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 17:44:12.525849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.525808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-metrics-client-ca\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.525849 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.525851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.525874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-wtmp\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.525957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-root\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.526026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.526072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-sys\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526258 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.526096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-textfile\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526258 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.526149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4q4\" (UniqueName: \"kubernetes.io/projected/208f38dc-f3ce-4e79-b9b1-1106f65c0831-kube-api-access-hb4q4\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.526258 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.526212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.626856 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-root\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-sys\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-textfile\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-root\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.626968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4q4\" (UniqueName: \"kubernetes.io/projected/208f38dc-f3ce-4e79-b9b1-1106f65c0831-kube-api-access-hb4q4\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-sys\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-metrics-client-ca\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-wtmp\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627310 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:44:12.627293 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 17:44:12.627521 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-wtmp\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627521 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:44:12.627381 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls podName:208f38dc-f3ce-4e79-b9b1-1106f65c0831 nodeName:}" failed. No retries permitted until 2026-04-16 17:44:13.127358761 +0000 UTC m=+189.635748915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls") pod "node-exporter-c9jqz" (UID: "208f38dc-f3ce-4e79-b9b1-1106f65c0831") : secret "node-exporter-tls" not found
Apr 16 17:44:12.627521 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-textfile\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627619 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.627685 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.627668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208f38dc-f3ce-4e79-b9b1-1106f65c0831-metrics-client-ca\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.629942 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.629926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:12.635286 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:12.635263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb4q4\" (UniqueName: \"kubernetes.io/projected/208f38dc-f3ce-4e79-b9b1-1106f65c0831-kube-api-access-hb4q4\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:13.132343 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.132290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:13.134772 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.134748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/208f38dc-f3ce-4e79-b9b1-1106f65c0831-node-exporter-tls\") pod \"node-exporter-c9jqz\" (UID: \"208f38dc-f3ce-4e79-b9b1-1106f65c0831\") " pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:13.352021 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.351969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c9jqz"
Apr 16 17:44:13.360043 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:44:13.360017 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208f38dc_f3ce_4e79_b9b1_1106f65c0831.slice/crio-cd22243c96de6eb117cbd4b0ab65aff434a433bafef852dafc58f7ef19143e1a WatchSource:0}: Error finding container cd22243c96de6eb117cbd4b0ab65aff434a433bafef852dafc58f7ef19143e1a: Status 404 returned error can't find the container with id cd22243c96de6eb117cbd4b0ab65aff434a433bafef852dafc58f7ef19143e1a
Apr 16 17:44:13.491610 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.491535 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 17:44:13.496219 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.496203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.498317 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498280 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 17:44:13.498556 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 17:44:13.498556 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498382 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 17:44:13.498556 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498383 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 17:44:13.498556 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 17:44:13.498739 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 17:44:13.498739 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 17:44:13.498739 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498643 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 17:44:13.498739 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 17:44:13.499015 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.498999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-484tf\""
Apr 16 17:44:13.511915 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.511891 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 17:44:13.636212 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxjf\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636402 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.636702 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.636651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.663737 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.663705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jqz" event={"ID":"208f38dc-f3ce-4e79-b9b1-1106f65c0831","Type":"ContainerStarted","Data":"cd22243c96de6eb117cbd4b0ab65aff434a433bafef852dafc58f7ef19143e1a"}
Apr 16 17:44:13.737780 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.737955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxjf\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.737955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.737955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.737955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.737955 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.737918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738337 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738490 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738694 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738694 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 17:44:13.738883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16
17:44:13.739737 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.738896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.739737 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.739167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.740644 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.740599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.742044 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.741972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.742160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.742139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
17:44:13.742553 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.742528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.743073 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.743033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.743750 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.743728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.744246 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.743980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.744246 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.744022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.744246 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.744131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.744480 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.744462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.745830 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.745797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxjf\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf\") pod \"alertmanager-main-0\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.805507 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.805474 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:44:13.950303 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:13.950259 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:44:14.040967 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:44:14.040899 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod227249aa_9164_4508_8973_1046e0ef27e6.slice/crio-dea2366d82cf42e38d4693570c2f729fdfef5cbbda7a5e7fe078e7a2e05ca722 WatchSource:0}: Error finding container dea2366d82cf42e38d4693570c2f729fdfef5cbbda7a5e7fe078e7a2e05ca722: Status 404 returned error can't find the container with id dea2366d82cf42e38d4693570c2f729fdfef5cbbda7a5e7fe078e7a2e05ca722 Apr 16 17:44:14.667609 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:14.667570 2577 generic.go:358] "Generic (PLEG): container finished" podID="208f38dc-f3ce-4e79-b9b1-1106f65c0831" containerID="41253386ff511981f17c7b1577148662f32320151535c25c70b5227a148c08f0" exitCode=0 Apr 16 17:44:14.668057 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:14.667636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jqz" event={"ID":"208f38dc-f3ce-4e79-b9b1-1106f65c0831","Type":"ContainerDied","Data":"41253386ff511981f17c7b1577148662f32320151535c25c70b5227a148c08f0"} Apr 16 17:44:14.668690 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:14.668662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"dea2366d82cf42e38d4693570c2f729fdfef5cbbda7a5e7fe078e7a2e05ca722"} Apr 16 17:44:15.676597 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:15.676561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jqz" 
event={"ID":"208f38dc-f3ce-4e79-b9b1-1106f65c0831","Type":"ContainerStarted","Data":"450d8e292b7dc623ad706c0a2fac0bfa6278e187b11d476c7fea97b3f0ee9214"} Apr 16 17:44:15.676597 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:15.676598 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jqz" event={"ID":"208f38dc-f3ce-4e79-b9b1-1106f65c0831","Type":"ContainerStarted","Data":"aa4ccb1a27716af490415a7cfc52b152101f48243b6e54bd66bddc179767ee6d"} Apr 16 17:44:15.677920 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:15.677900 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="1bf51c95512199d707eaded0075c646f93f2328a38df0dfca1e5cb490ab7b15e" exitCode=0 Apr 16 17:44:15.677983 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:15.677948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"1bf51c95512199d707eaded0075c646f93f2328a38df0dfca1e5cb490ab7b15e"} Apr 16 17:44:15.695081 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:15.695026 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c9jqz" podStartSLOduration=2.9734565059999998 podStartE2EDuration="3.69501064s" podCreationTimestamp="2026-04-16 17:44:12 +0000 UTC" firstStartedPulling="2026-04-16 17:44:13.362437967 +0000 UTC m=+189.870828107" lastFinishedPulling="2026-04-16 17:44:14.083992089 +0000 UTC m=+190.592382241" observedRunningTime="2026-04-16 17:44:15.694279511 +0000 UTC m=+192.202669674" watchObservedRunningTime="2026-04-16 17:44:15.69501064 +0000 UTC m=+192.203400803" Apr 16 17:44:17.193539 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.193513 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m"] Apr 16 17:44:17.196711 ip-10-0-143-234 
kubenswrapper[2577]: I0416 17:44:17.196692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:17.198724 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.198658 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 17:44:17.198724 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.198713 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-hx96r\"" Apr 16 17:44:17.204776 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.204744 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m"] Apr 16 17:44:17.373487 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.373445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9m42m\" (UID: \"cd1706f9-bd64-44a9-bb23-a64284d2567a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:17.474041 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.473949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9m42m\" (UID: \"cd1706f9-bd64-44a9-bb23-a64284d2567a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:17.474180 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:44:17.474076 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 17:44:17.474180 ip-10-0-143-234 kubenswrapper[2577]: E0416 
17:44:17.474137 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert podName:cd1706f9-bd64-44a9-bb23-a64284d2567a nodeName:}" failed. No retries permitted until 2026-04-16 17:44:17.974118821 +0000 UTC m=+194.482508971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-9m42m" (UID: "cd1706f9-bd64-44a9-bb23-a64284d2567a") : secret "monitoring-plugin-cert" not found Apr 16 17:44:17.686238 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.686206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"a8b22a77ec5f79a9d40378db7696028fdda56bc249b901126526ce648be88244"} Apr 16 17:44:17.686238 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.686240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"8226e0c1378d12c2c7247c983d438ec57c3123c190c206009eb9bf439898c639"} Apr 16 17:44:17.686481 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.686252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"8b86be79d3c813e32b5ffac0f354d383a6d82cf625d8b8d1ac801f0f22dd7e9b"} Apr 16 17:44:17.686481 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.686266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"b4e0821179da64e532e8c5095698a3de2c1230723a758be48c25ae0532ab27e8"} Apr 16 
17:44:17.686481 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.686278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"3ac4e859e874d86b0e7994856584dd4e2b9c988027f7d8200180619a98014884"} Apr 16 17:44:17.978709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.978674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9m42m\" (UID: \"cd1706f9-bd64-44a9-bb23-a64284d2567a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:17.981157 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:17.981127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd1706f9-bd64-44a9-bb23-a64284d2567a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9m42m\" (UID: \"cd1706f9-bd64-44a9-bb23-a64284d2567a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:18.108978 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:18.108953 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:18.220347 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:18.220302 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m"] Apr 16 17:44:18.223985 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:44:18.223950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd1706f9_bd64_44a9_bb23_a64284d2567a.slice/crio-1091e18d3ac5e7acba3a56fb8f5deed821ea81dc814c0db90927fd55f4149d1b WatchSource:0}: Error finding container 1091e18d3ac5e7acba3a56fb8f5deed821ea81dc814c0db90927fd55f4149d1b: Status 404 returned error can't find the container with id 1091e18d3ac5e7acba3a56fb8f5deed821ea81dc814c0db90927fd55f4149d1b Apr 16 17:44:18.695508 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:18.695471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerStarted","Data":"707cef41c8fd05be1cf5a3fa2157ef22f167733a2e20a8065ffec5be50c6ec8c"} Apr 16 17:44:18.696565 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:18.696538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" event={"ID":"cd1706f9-bd64-44a9-bb23-a64284d2567a","Type":"ContainerStarted","Data":"1091e18d3ac5e7acba3a56fb8f5deed821ea81dc814c0db90927fd55f4149d1b"} Apr 16 17:44:18.737408 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:18.737356 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.892644734 podStartE2EDuration="5.737320028s" podCreationTimestamp="2026-04-16 17:44:13 +0000 UTC" firstStartedPulling="2026-04-16 17:44:14.04291037 +0000 UTC m=+190.551300517" lastFinishedPulling="2026-04-16 17:44:17.887585653 +0000 UTC 
m=+194.395975811" observedRunningTime="2026-04-16 17:44:18.73076346 +0000 UTC m=+195.239153633" watchObservedRunningTime="2026-04-16 17:44:18.737320028 +0000 UTC m=+195.245710189" Apr 16 17:44:19.700299 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:19.700266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" event={"ID":"cd1706f9-bd64-44a9-bb23-a64284d2567a","Type":"ContainerStarted","Data":"30f9107fefbf3f19e27a5357aed2183237a23d2ab95d7ce168b5f1e503f84ca5"} Apr 16 17:44:19.700655 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:19.700550 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:19.704995 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:19.704975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" Apr 16 17:44:19.716716 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:19.716675 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9m42m" podStartSLOduration=1.638552647 podStartE2EDuration="2.716662363s" podCreationTimestamp="2026-04-16 17:44:17 +0000 UTC" firstStartedPulling="2026-04-16 17:44:18.226187758 +0000 UTC m=+194.734577901" lastFinishedPulling="2026-04-16 17:44:19.304297477 +0000 UTC m=+195.812687617" observedRunningTime="2026-04-16 17:44:19.716172112 +0000 UTC m=+196.224562274" watchObservedRunningTime="2026-04-16 17:44:19.716662363 +0000 UTC m=+196.225052579" Apr 16 17:44:28.036366 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.036306 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-767977f9c4-tqv6c"] Apr 16 17:44:28.036833 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:44:28.036531 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" podUID="1b98fec4-6489-4373-b88e-c49c1e82c443" Apr 16 17:44:28.721876 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.721843 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:44:28.725680 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.725659 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c" Apr 16 17:44:28.768971 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.768943 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.768997 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769024 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769060 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jsxf6\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769104 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769135 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769126 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769463 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769171 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted\") pod \"1b98fec4-6489-4373-b88e-c49c1e82c443\" (UID: \"1b98fec4-6489-4373-b88e-c49c1e82c443\") " Apr 16 17:44:28.769463 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769395 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:44:28.769567 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769525 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:44:28.769654 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.769635 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:44:28.771646 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.771617 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:44:28.771746 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.771662 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:44:28.771818 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.771792 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6" (OuterVolumeSpecName: "kube-api-access-jsxf6") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "kube-api-access-jsxf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:44:28.771863 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.771818 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1b98fec4-6489-4373-b88e-c49c1e82c443" (UID: "1b98fec4-6489-4373-b88e-c49c1e82c443"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:44:28.870149 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870119 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-trusted-ca\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:44:28.870149 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870142 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-installation-pull-secrets\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:44:28.870149 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870152 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b98fec4-6489-4373-b88e-c49c1e82c443-ca-trust-extracted\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 
17:44:28.870412 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870162 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-certificates\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:44:28.870412 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870171 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b98fec4-6489-4373-b88e-c49c1e82c443-image-registry-private-configuration\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:44:28.870412 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870180 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-bound-sa-token\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:44:28.870412 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:28.870189 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsxf6\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-kube-api-access-jsxf6\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:44:29.724013 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:29.723982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-767977f9c4-tqv6c"
Apr 16 17:44:29.760578 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:29.760526 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-767977f9c4-tqv6c"]
Apr 16 17:44:29.765956 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:29.765935 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-767977f9c4-tqv6c"]
Apr 16 17:44:29.878395 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:29.878364 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b98fec4-6489-4373-b88e-c49c1e82c443-registry-tls\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:44:30.121311 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:30.121278 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b98fec4-6489-4373-b88e-c49c1e82c443" path="/var/lib/kubelet/pods/1b98fec4-6489-4373-b88e-c49c1e82c443/volumes"
Apr 16 17:44:31.206138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.206103 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:44:31.210721 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.210698 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.213645 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213622 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213912 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213953 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213884 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ghc5z\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 17:44:31.214174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.213888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 17:44:31.219234 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.219214 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:44:31.289340 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.289501 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.289501 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.289501 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwt7n\" (UniqueName: \"kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.289501 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.289629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.289562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.390771 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.390964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.390964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwt7n\" (UniqueName: \"kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.390964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.390964 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.391164 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.390968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.391639 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.391568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.391639 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.391647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.392083 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.392066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.393240 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.393215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.393354 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.393298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.399435 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.399415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwt7n\" (UniqueName: \"kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n\") pod \"console-6df48d546f-5d9sh\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") " pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.520621 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.520541 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:31.638890 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.638861 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:44:31.642982 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:44:31.642956 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27343c8_141c_4a3b_912a_900249a065cf.slice/crio-1e227b8360149cb2d22b85c8aaf07966804a7551573032b3888aabc1471c28c6 WatchSource:0}: Error finding container 1e227b8360149cb2d22b85c8aaf07966804a7551573032b3888aabc1471c28c6: Status 404 returned error can't find the container with id 1e227b8360149cb2d22b85c8aaf07966804a7551573032b3888aabc1471c28c6
Apr 16 17:44:31.730340 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:31.730306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df48d546f-5d9sh" event={"ID":"d27343c8-141c-4a3b-912a-900249a065cf","Type":"ContainerStarted","Data":"1e227b8360149cb2d22b85c8aaf07966804a7551573032b3888aabc1471c28c6"}
Apr 16 17:44:34.739732 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:34.739699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df48d546f-5d9sh" event={"ID":"d27343c8-141c-4a3b-912a-900249a065cf","Type":"ContainerStarted","Data":"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"}
Apr 16 17:44:34.760081 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:34.760030 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6df48d546f-5d9sh" podStartSLOduration=1.281903101 podStartE2EDuration="3.760017177s" podCreationTimestamp="2026-04-16 17:44:31 +0000 UTC" firstStartedPulling="2026-04-16 17:44:31.64472936 +0000 UTC m=+208.153119500" lastFinishedPulling="2026-04-16 17:44:34.122843434 +0000 UTC m=+210.631233576" observedRunningTime="2026-04-16 17:44:34.759701381 +0000 UTC m=+211.268091556" watchObservedRunningTime="2026-04-16 17:44:34.760017177 +0000 UTC m=+211.268407338"
Apr 16 17:44:41.521634 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:41.521584 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:41.521634 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:41.521643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:41.526267 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:41.526247 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:41.762498 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:41.762470 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:44:48.083666 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:48.083610 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" podUID="fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:44:51.765541 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:51.765509 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:44:58.083745 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:44:58.083705 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" podUID="fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:45:08.083896 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.083852 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" podUID="fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:45:08.084297 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.083922 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m"
Apr 16 17:45:08.084407 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.084377 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"2f229045f2fc0b984d1877d12679008892a8e62c7c039ab15a9dc0c1a1cf76be"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 17:45:08.084447 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.084427 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" podUID="fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc" containerName="service-proxy" containerID="cri-o://2f229045f2fc0b984d1877d12679008892a8e62c7c039ab15a9dc0c1a1cf76be" gracePeriod=30
Apr 16 17:45:08.826665 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.826612 2577 generic.go:358] "Generic (PLEG): container finished" podID="fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc" containerID="2f229045f2fc0b984d1877d12679008892a8e62c7c039ab15a9dc0c1a1cf76be" exitCode=2
Apr 16 17:45:08.826827 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.826678 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerDied","Data":"2f229045f2fc0b984d1877d12679008892a8e62c7c039ab15a9dc0c1a1cf76be"}
Apr 16 17:45:08.826827 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:08.826714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5546c7f497-k2f7m" event={"ID":"fcf73ecf-b5ca-400b-b50d-cbfc375fe2fc","Type":"ContainerStarted","Data":"c4d911ce7e02f0070375bea35b4751c89a7a74d932d12f2f713bbf12e8e704f6"}
Apr 16 17:45:15.946560 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:15.946471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:45:15.948837 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:15.948812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eccdd8a8-ee59-4c3c-852e-f012ce698554-metrics-certs\") pod \"network-metrics-daemon-gg8gs\" (UID: \"eccdd8a8-ee59-4c3c-852e-f012ce698554\") " pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:45:16.120568 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:16.120534 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8xx6l\""
Apr 16 17:45:16.129230 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:16.129199 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg8gs"
Apr 16 17:45:16.266732 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:16.266705 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gg8gs"]
Apr 16 17:45:16.784978 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:16.784920 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6df48d546f-5d9sh" podUID="d27343c8-141c-4a3b-912a-900249a065cf" containerName="console" containerID="cri-o://10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa" gracePeriod=15
Apr 16 17:45:16.847684 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:16.847644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg8gs" event={"ID":"eccdd8a8-ee59-4c3c-852e-f012ce698554","Type":"ContainerStarted","Data":"e11bda2f79ad80bb8bdd28adf05670aa97db32227f3763a27b7b32047f038af6"}
Apr 16 17:45:17.234177 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.234156 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df48d546f-5d9sh_d27343c8-141c-4a3b-912a-900249a065cf/console/0.log"
Apr 16 17:45:17.234522 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.234213 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:45:17.356405 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356375 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356545 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356441 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356545 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356466 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwt7n\" (UniqueName: \"kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356545 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356512 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356708 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356549 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356708 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356572 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config\") pod \"d27343c8-141c-4a3b-912a-900249a065cf\" (UID: \"d27343c8-141c-4a3b-912a-900249a065cf\") "
Apr 16 17:45:17.356911 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.356758 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:45:17.357153 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.357086 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config" (OuterVolumeSpecName: "console-config") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:45:17.357153 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.357143 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:45:17.359186 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.359148 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n" (OuterVolumeSpecName: "kube-api-access-kwt7n") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "kube-api-access-kwt7n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:45:17.359289 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.359192 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:45:17.359289 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.359265 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d27343c8-141c-4a3b-912a-900249a065cf" (UID: "d27343c8-141c-4a3b-912a-900249a065cf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:45:17.457348 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457308 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-service-ca\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.457348 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457344 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kwt7n\" (UniqueName: \"kubernetes.io/projected/d27343c8-141c-4a3b-912a-900249a065cf-kube-api-access-kwt7n\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.457537 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457355 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-serving-cert\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.457537 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457366 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d27343c8-141c-4a3b-912a-900249a065cf-console-oauth-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.457537 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457374 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-console-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.457537 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.457384 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d27343c8-141c-4a3b-912a-900249a065cf-oauth-serving-cert\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:45:17.854118 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854092 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df48d546f-5d9sh_d27343c8-141c-4a3b-912a-900249a065cf/console/0.log"
Apr 16 17:45:17.854288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854131 2577 generic.go:358] "Generic (PLEG): container finished" podID="d27343c8-141c-4a3b-912a-900249a065cf" containerID="10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa" exitCode=2
Apr 16 17:45:17.854288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df48d546f-5d9sh" event={"ID":"d27343c8-141c-4a3b-912a-900249a065cf","Type":"ContainerDied","Data":"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"}
Apr 16 17:45:17.854288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854197 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df48d546f-5d9sh"
Apr 16 17:45:17.854288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df48d546f-5d9sh" event={"ID":"d27343c8-141c-4a3b-912a-900249a065cf","Type":"ContainerDied","Data":"1e227b8360149cb2d22b85c8aaf07966804a7551573032b3888aabc1471c28c6"}
Apr 16 17:45:17.854288 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.854234 2577 scope.go:117] "RemoveContainer" containerID="10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"
Apr 16 17:45:17.855837 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.855806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg8gs" event={"ID":"eccdd8a8-ee59-4c3c-852e-f012ce698554","Type":"ContainerStarted","Data":"d49fd972614b2f1c8d83457c76c9da47a4eeac3eb83188037a13a952dbb490fc"}
Apr 16 17:45:17.855837 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.855834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg8gs" event={"ID":"eccdd8a8-ee59-4c3c-852e-f012ce698554","Type":"ContainerStarted","Data":"f4d8f796d446630d8107062a0a231da87056f93ec00d7503b3ade9b55c6c02fe"}
Apr 16 17:45:17.862269 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.862256 2577 scope.go:117] "RemoveContainer" containerID="10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"
Apr 16 17:45:17.862558 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:45:17.862539 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa\": container with ID starting with 10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa not found: ID does not exist" containerID="10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"
Apr 16 17:45:17.862619 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.862566 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa"} err="failed to get container status \"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa\": rpc error: code = NotFound desc = could not find container \"10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa\": container with ID starting with 10711bb4a73633ac13f0786701af142a44a101f2f026b24b42a4b8e0118fd4fa not found: ID does not exist"
Apr 16 17:45:17.873899 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.873857 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gg8gs" podStartSLOduration=252.879593507 podStartE2EDuration="4m13.873846548s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:45:16.272718353 +0000 UTC m=+252.781108499" lastFinishedPulling="2026-04-16 17:45:17.266971384 +0000 UTC m=+253.775361540" observedRunningTime="2026-04-16 17:45:17.872747044 +0000 UTC m=+254.381137206" watchObservedRunningTime="2026-04-16 17:45:17.873846548 +0000 UTC m=+254.382236709"
Apr 16 17:45:17.889594 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.889569 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:45:17.894120 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:17.894101 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6df48d546f-5d9sh"]
Apr 16 17:45:18.121183 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:18.121111 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27343c8-141c-4a3b-912a-900249a065cf" path="/var/lib/kubelet/pods/d27343c8-141c-4a3b-912a-900249a065cf/volumes"
Apr 16 17:45:29.530490 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.530456 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6978867857-bcqhn"]
Apr 16 17:45:29.530913 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.530700 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d27343c8-141c-4a3b-912a-900249a065cf" containerName="console"
Apr 16 17:45:29.530913 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.530710 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27343c8-141c-4a3b-912a-900249a065cf" containerName="console"
Apr 16 17:45:29.530913 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.530752 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d27343c8-141c-4a3b-912a-900249a065cf" containerName="console"
Apr 16 17:45:29.534932 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.534914 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.536941 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.536921 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 17:45:29.537086 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537068 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 17:45:29.537179 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537163 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 17:45:29.537290 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 17:45:29.537421 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537402 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ghc5z\""
Apr 16 17:45:29.537698 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537673 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 17:45:29.537799 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537707 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 17:45:29.537799 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.537711 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 17:45:29.542773 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.542751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 17:45:29.544318 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.544297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6978867857-bcqhn"]
Apr 16 17:45:29.551772 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.551748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sw4f\" (UniqueName: \"kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.551875 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.551785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.551875 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.551838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.551988 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.551901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.551988 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.551969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.552088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.552009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.552088 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.552038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.652848 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:29.652848 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " 
pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sw4f\" (UniqueName: \"kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653095 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.652990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle\") pod 
\"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653754 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.653726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653854 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.653775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653854 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.653806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.653925 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.653861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.656087 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.656061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.656184 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.656136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.660225 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.660202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sw4f\" (UniqueName: \"kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f\") pod \"console-6978867857-bcqhn\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.846486 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.846445 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:45:29.964904 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:29.964867 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6978867857-bcqhn"] Apr 16 17:45:29.968083 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:45:29.968054 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe977df4_a82e_439f_a7ac_81d911856465.slice/crio-7600de6e13c09609853c2e461f256c1f22a617f267f3b35a6b58e483857eb140 WatchSource:0}: Error finding container 7600de6e13c09609853c2e461f256c1f22a617f267f3b35a6b58e483857eb140: Status 404 returned error can't find the container with id 7600de6e13c09609853c2e461f256c1f22a617f267f3b35a6b58e483857eb140 Apr 16 17:45:30.891504 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:30.891470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6978867857-bcqhn" event={"ID":"be977df4-a82e-439f-a7ac-81d911856465","Type":"ContainerStarted","Data":"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2"} Apr 16 17:45:30.891504 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:30.891502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6978867857-bcqhn" event={"ID":"be977df4-a82e-439f-a7ac-81d911856465","Type":"ContainerStarted","Data":"7600de6e13c09609853c2e461f256c1f22a617f267f3b35a6b58e483857eb140"} Apr 16 17:45:30.909774 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:30.909729 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6978867857-bcqhn" podStartSLOduration=1.9097168519999999 podStartE2EDuration="1.909716852s" podCreationTimestamp="2026-04-16 17:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:45:30.907994124 +0000 UTC 
m=+267.416384298" watchObservedRunningTime="2026-04-16 17:45:30.909716852 +0000 UTC m=+267.418107014" Apr 16 17:45:32.759163 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759126 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:32.759632 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759574 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="alertmanager" containerID="cri-o://3ac4e859e874d86b0e7994856584dd4e2b9c988027f7d8200180619a98014884" gracePeriod=120 Apr 16 17:45:32.759632 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759597 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-web" containerID="cri-o://8b86be79d3c813e32b5ffac0f354d383a6d82cf625d8b8d1ac801f0f22dd7e9b" gracePeriod=120 Apr 16 17:45:32.759632 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759591 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-metric" containerID="cri-o://a8b22a77ec5f79a9d40378db7696028fdda56bc249b901126526ce648be88244" gracePeriod=120 Apr 16 17:45:32.759835 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759618 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="config-reloader" containerID="cri-o://b4e0821179da64e532e8c5095698a3de2c1230723a758be48c25ae0532ab27e8" gracePeriod=120 Apr 16 17:45:32.759835 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759630 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy" containerID="cri-o://8226e0c1378d12c2c7247c983d438ec57c3123c190c206009eb9bf439898c639" gracePeriod=120 Apr 16 17:45:32.759835 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.759703 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="prom-label-proxy" containerID="cri-o://707cef41c8fd05be1cf5a3fa2157ef22f167733a2e20a8065ffec5be50c6ec8c" gracePeriod=120 Apr 16 17:45:32.900051 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900024 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="707cef41c8fd05be1cf5a3fa2157ef22f167733a2e20a8065ffec5be50c6ec8c" exitCode=0 Apr 16 17:45:32.900051 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900047 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="a8b22a77ec5f79a9d40378db7696028fdda56bc249b901126526ce648be88244" exitCode=0 Apr 16 17:45:32.900051 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900054 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="8226e0c1378d12c2c7247c983d438ec57c3123c190c206009eb9bf439898c639" exitCode=0 Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900059 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="b4e0821179da64e532e8c5095698a3de2c1230723a758be48c25ae0532ab27e8" exitCode=0 Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900064 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="3ac4e859e874d86b0e7994856584dd4e2b9c988027f7d8200180619a98014884" exitCode=0 Apr 16 
17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"707cef41c8fd05be1cf5a3fa2157ef22f167733a2e20a8065ffec5be50c6ec8c"} Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"a8b22a77ec5f79a9d40378db7696028fdda56bc249b901126526ce648be88244"} Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"8226e0c1378d12c2c7247c983d438ec57c3123c190c206009eb9bf439898c639"} Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900126 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"b4e0821179da64e532e8c5095698a3de2c1230723a758be48c25ae0532ab27e8"} Apr 16 17:45:32.900208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:32.900135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"3ac4e859e874d86b0e7994856584dd4e2b9c988027f7d8200180619a98014884"} Apr 16 17:45:33.905967 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:33.905935 2577 generic.go:358] "Generic (PLEG): container finished" podID="227249aa-9164-4508-8973-1046e0ef27e6" containerID="8b86be79d3c813e32b5ffac0f354d383a6d82cf625d8b8d1ac801f0f22dd7e9b" exitCode=0 Apr 16 17:45:33.906290 ip-10-0-143-234 kubenswrapper[2577]: I0416 
17:45:33.905978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"8b86be79d3c813e32b5ffac0f354d383a6d82cf625d8b8d1ac801f0f22dd7e9b"} Apr 16 17:45:33.994514 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:33.994490 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.090834 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090761 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.090834 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090803 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.090834 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090825 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090852 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090885 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090938 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.090967 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091091 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091012 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091364 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091238 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: 
"227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:34.091364 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091258 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:34.091364 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091271 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091364 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091304 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091713 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091372 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqxjf\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091713 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091400 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091713 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume\") pod \"227249aa-9164-4508-8973-1046e0ef27e6\" (UID: \"227249aa-9164-4508-8973-1046e0ef27e6\") " Apr 16 17:45:34.091864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091821 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.091864 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.091840 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-alertmanager-main-db\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.093733 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.093698 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.094230 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.093953 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.094230 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.093980 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.094400 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.094358 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out" (OuterVolumeSpecName: "config-out") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:34.094470 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.094451 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:45:34.094678 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.094652 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:45:34.094800 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.094783 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.095576 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.095550 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf" (OuterVolumeSpecName: "kube-api-access-fqxjf") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "kube-api-access-fqxjf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:45:34.096382 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.096362 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.097919 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.097897 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.103557 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.103384 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config" (OuterVolumeSpecName: "web-config") pod "227249aa-9164-4508-8973-1046e0ef27e6" (UID: "227249aa-9164-4508-8973-1046e0ef27e6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:45:34.192138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192109 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-cluster-tls-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192134 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-main-tls\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192145 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 
17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192155 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192164 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-web-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192173 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-tls-assets\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192180 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/227249aa-9164-4508-8973-1046e0ef27e6-config-out\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192187 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/227249aa-9164-4508-8973-1046e0ef27e6-metrics-client-ca\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192195 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqxjf\" (UniqueName: \"kubernetes.io/projected/227249aa-9164-4508-8973-1046e0ef27e6-kube-api-access-fqxjf\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192204 2577 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.192294 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.192212 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/227249aa-9164-4508-8973-1046e0ef27e6-config-volume\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:45:34.911726 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.911693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"227249aa-9164-4508-8973-1046e0ef27e6","Type":"ContainerDied","Data":"dea2366d82cf42e38d4693570c2f729fdfef5cbbda7a5e7fe078e7a2e05ca722"} Apr 16 17:45:34.912094 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.911740 2577 scope.go:117] "RemoveContainer" containerID="707cef41c8fd05be1cf5a3fa2157ef22f167733a2e20a8065ffec5be50c6ec8c" Apr 16 17:45:34.912094 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.911748 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.918379 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.918360 2577 scope.go:117] "RemoveContainer" containerID="a8b22a77ec5f79a9d40378db7696028fdda56bc249b901126526ce648be88244" Apr 16 17:45:34.924692 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.924677 2577 scope.go:117] "RemoveContainer" containerID="8226e0c1378d12c2c7247c983d438ec57c3123c190c206009eb9bf439898c639" Apr 16 17:45:34.929781 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.929745 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:34.931182 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.931167 2577 scope.go:117] "RemoveContainer" containerID="8b86be79d3c813e32b5ffac0f354d383a6d82cf625d8b8d1ac801f0f22dd7e9b" Apr 16 17:45:34.935721 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.935703 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:34.937693 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.937667 2577 scope.go:117] "RemoveContainer" containerID="b4e0821179da64e532e8c5095698a3de2c1230723a758be48c25ae0532ab27e8" Apr 16 17:45:34.944044 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.944027 2577 scope.go:117] "RemoveContainer" containerID="3ac4e859e874d86b0e7994856584dd4e2b9c988027f7d8200180619a98014884" Apr 16 17:45:34.950030 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.950015 2577 scope.go:117] "RemoveContainer" containerID="1bf51c95512199d707eaded0075c646f93f2328a38df0dfca1e5cb490ab7b15e" Apr 16 17:45:34.964629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964607 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:34.964830 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964818 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-metric" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964831 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-metric" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964840 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-web" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964845 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-web" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964854 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="init-config-reloader" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964861 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="init-config-reloader" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964869 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="alertmanager" Apr 16 17:45:34.964872 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964873 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="alertmanager" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964881 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964890 2577 
state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964899 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="config-reloader" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964904 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="config-reloader" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964909 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="prom-label-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964914 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="prom-label-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964951 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="prom-label-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964960 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="alertmanager" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964966 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="config-reloader" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964972 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-web" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964977 
2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy" Apr 16 17:45:34.965063 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.964981 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="227249aa-9164-4508-8973-1046e0ef27e6" containerName="kube-rbac-proxy-metric" Apr 16 17:45:34.970143 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.970126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.972199 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972178 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-484tf\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972221 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 17:45:34.972345 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972246 2577 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 17:45:34.972635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 17:45:34.972635 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.972604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 17:45:34.977310 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.977289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 17:45:34.985204 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.985184 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:34.999174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999305 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999305 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999305 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999305 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999305 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-volume\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-out\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-web-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999724 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:34.999724 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:34.999592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbq2n\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-kube-api-access-fbq2n\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.100899 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.100899 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.100899 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbq2n\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-kube-api-access-fbq2n\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.100985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-volume\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-out\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-web-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.101629 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.101602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.102256 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.102111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.103258 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.103232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104224 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-web-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104370 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104447 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104794 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104794 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104794 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.104995 
ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.104906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.105172 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.105153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-volume\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.105630 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.105612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-config-out\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.108582 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.108555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbq2n\" (UniqueName: \"kubernetes.io/projected/d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258-kube-api-access-fbq2n\") pod \"alertmanager-main-0\" (UID: \"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.279717 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.279664 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:45:35.408505 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.408282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:45:35.410915 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:45:35.410885 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd71fe6d1_b0f9_42ae_9c9d_3fd2d3e1a258.slice/crio-198913ad30a8a15d8328fbeaf8aaaec4af864d1a38c6c739b9ca3d5082759603 WatchSource:0}: Error finding container 198913ad30a8a15d8328fbeaf8aaaec4af864d1a38c6c739b9ca3d5082759603: Status 404 returned error can't find the container with id 198913ad30a8a15d8328fbeaf8aaaec4af864d1a38c6c739b9ca3d5082759603 Apr 16 17:45:35.916800 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.916757 2577 generic.go:358] "Generic (PLEG): container finished" podID="d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258" containerID="5cb05bda3b8ad6ad26332a6c6cb499fad612fbef213b2120843664961e69b2f1" exitCode=0 Apr 16 17:45:35.917145 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.916828 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerDied","Data":"5cb05bda3b8ad6ad26332a6c6cb499fad612fbef213b2120843664961e69b2f1"} Apr 16 17:45:35.917145 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:35.916856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"198913ad30a8a15d8328fbeaf8aaaec4af864d1a38c6c739b9ca3d5082759603"} Apr 16 17:45:36.121890 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.121860 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="227249aa-9164-4508-8973-1046e0ef27e6" 
path="/var/lib/kubelet/pods/227249aa-9164-4508-8973-1046e0ef27e6/volumes" Apr 16 17:45:36.922344 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"0b9c93defa8e06303889b3c26c9baa8b2724c93e84372f3eeb39f668ab62c76c"} Apr 16 17:45:36.922723 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922366 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"e6b8fbf645858efde0eafaf1022b78c74d7de51a1042fbaac930739e6d2addd5"} Apr 16 17:45:36.922723 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"29711d063b284d35b9b4376ed5abce7d22a514c849a2a81c7911cb7e1b3f1f56"} Apr 16 17:45:36.922723 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"45262497bc051a9f7f044333a89e3643b20853342f5d4759d4bda0e5a0377d42"} Apr 16 17:45:36.922723 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"69a5a0a7ab95b777d495e88ef574ee5076b8352485e926bf6ff6af766316c71d"} Apr 16 17:45:36.922723 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.922398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258","Type":"ContainerStarted","Data":"8aa9a492246f6e5e22ec139ee9436db8fccc7d27d48177d0cef285cb721797d4"}
Apr 16 17:45:36.949529 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:36.949487 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.949474271 podStartE2EDuration="2.949474271s" podCreationTimestamp="2026-04-16 17:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:45:36.9485248 +0000 UTC m=+273.456914961" watchObservedRunningTime="2026-04-16 17:45:36.949474271 +0000 UTC m=+273.457864494"
Apr 16 17:45:39.846681 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:39.846629 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:39.847040 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:39.846713 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:39.851387 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:39.851364 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:39.934181 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:39.934155 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6978867857-bcqhn"
Apr 16 17:45:43.583111 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:45:43.583056 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" podUID="ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1"
Apr 16 17:45:43.583111 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:45:43.583055 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6l87f" podUID="d0738358-399f-4f84-8552-0728eba20372"
Apr 16 17:45:43.583111 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:45:43.583083 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zgfj4" podUID="573f0e79-0a24-47b1-9570-15a67f037365"
Apr 16 17:45:43.942685 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:43.942593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:45:43.942685 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:43.942624 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:45:43.942685 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:43.942632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6l87f"
Apr 16 17:45:46.897463 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.897422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:45:46.899939 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.899903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573f0e79-0a24-47b1-9570-15a67f037365-cert\") pod \"ingress-canary-zgfj4\" (UID: \"573f0e79-0a24-47b1-9570-15a67f037365\") " pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:45:46.945760 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.945727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w8phs\""
Apr 16 17:45:46.953833 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.953803 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfj4"
Apr 16 17:45:46.999518 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.999027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:45:46.999518 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:46.999256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:45:47.001973 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.001947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0738358-399f-4f84-8552-0728eba20372-metrics-tls\") pod \"dns-default-6l87f\" (UID: \"d0738358-399f-4f84-8552-0728eba20372\") " pod="openshift-dns/dns-default-6l87f"
Apr 16 17:45:47.002519 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.002451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-46ktl\" (UID: \"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:45:47.073067 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.073044 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfj4"]
Apr 16 17:45:47.075520 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:45:47.075491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573f0e79_0a24_47b1_9570_15a67f037365.slice/crio-563daae1eb75aa455666fe45de991443096a027c43577db22cca87e22221edd6 WatchSource:0}: Error finding container 563daae1eb75aa455666fe45de991443096a027c43577db22cca87e22221edd6: Status 404 returned error can't find the container with id 563daae1eb75aa455666fe45de991443096a027c43577db22cca87e22221edd6
Apr 16 17:45:47.245904 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.245825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2vlxb\""
Apr 16 17:45:47.245904 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.245868 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bfzrx\""
Apr 16 17:45:47.254272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.254254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"
Apr 16 17:45:47.254272 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.254269 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6l87f"
Apr 16 17:45:47.380977 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.380947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl"]
Apr 16 17:45:47.384097 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:45:47.384059 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab1c4c42_78b6_4a7e_aadc_83a1b22e89c1.slice/crio-93ec6990f55b026777b1abf81015e6a21e055278c45c13dfecabbaf4975cc59e WatchSource:0}: Error finding container 93ec6990f55b026777b1abf81015e6a21e055278c45c13dfecabbaf4975cc59e: Status 404 returned error can't find the container with id 93ec6990f55b026777b1abf81015e6a21e055278c45c13dfecabbaf4975cc59e
Apr 16 17:45:47.401308 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.401280 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6l87f"]
Apr 16 17:45:47.404141 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:45:47.404108 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0738358_399f_4f84_8552_0728eba20372.slice/crio-2be9af1c1b0988db1efcb27d1b3a3b6e19f37817660f3a5720f6fc740f2a1d62 WatchSource:0}: Error finding container 2be9af1c1b0988db1efcb27d1b3a3b6e19f37817660f3a5720f6fc740f2a1d62: Status 404 returned error can't find the container with id 2be9af1c1b0988db1efcb27d1b3a3b6e19f37817660f3a5720f6fc740f2a1d62
Apr 16 17:45:47.955089 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.955051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" event={"ID":"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1","Type":"ContainerStarted","Data":"93ec6990f55b026777b1abf81015e6a21e055278c45c13dfecabbaf4975cc59e"}
Apr 16 17:45:47.956626 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.956578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgfj4" event={"ID":"573f0e79-0a24-47b1-9570-15a67f037365","Type":"ContainerStarted","Data":"563daae1eb75aa455666fe45de991443096a027c43577db22cca87e22221edd6"}
Apr 16 17:45:47.958326 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:47.958279 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l87f" event={"ID":"d0738358-399f-4f84-8552-0728eba20372","Type":"ContainerStarted","Data":"2be9af1c1b0988db1efcb27d1b3a3b6e19f37817660f3a5720f6fc740f2a1d62"}
Apr 16 17:45:49.964665 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.964577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l87f" event={"ID":"d0738358-399f-4f84-8552-0728eba20372","Type":"ContainerStarted","Data":"dfb852a433e3c5932d4a0a88ab299d03ab0e58e68c6283a6bc9d6848fa756af1"}
Apr 16 17:45:49.964665 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.964615 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l87f" event={"ID":"d0738358-399f-4f84-8552-0728eba20372","Type":"ContainerStarted","Data":"7a153dd3104272be4dd0b0f1e263f2dd7f17500faa79cf874d7785e759d8f5fa"}
Apr 16 17:45:49.965080 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.964788 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6l87f"
Apr 16 17:45:49.965882 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.965863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" event={"ID":"ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1","Type":"ContainerStarted","Data":"5c9ad4f1c89f90d70c1f1854529bc4bea536b5b6d3001f7ae3a6f5afe2497802"}
Apr 16 17:45:49.966995 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.966977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgfj4" event={"ID":"573f0e79-0a24-47b1-9570-15a67f037365","Type":"ContainerStarted","Data":"baec5b85f726933408745866759092381c7fb4587ad80c695566706670fc3108"}
Apr 16 17:45:49.982293 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.982249 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6l87f" podStartSLOduration=252.020109621 podStartE2EDuration="4m13.982235891s" podCreationTimestamp="2026-04-16 17:41:36 +0000 UTC" firstStartedPulling="2026-04-16 17:45:47.4057862 +0000 UTC m=+283.914176342" lastFinishedPulling="2026-04-16 17:45:49.367912461 +0000 UTC m=+285.876302612" observedRunningTime="2026-04-16 17:45:49.98206443 +0000 UTC m=+286.490454622" watchObservedRunningTime="2026-04-16 17:45:49.982235891 +0000 UTC m=+286.490626053"
Apr 16 17:45:49.997904 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:49.997853 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zgfj4" podStartSLOduration=251.70926812 podStartE2EDuration="4m13.997836026s" podCreationTimestamp="2026-04-16 17:41:36 +0000 UTC" firstStartedPulling="2026-04-16 17:45:47.077379146 +0000 UTC m=+283.585769290" lastFinishedPulling="2026-04-16 17:45:49.365947053 +0000 UTC m=+285.874337196" observedRunningTime="2026-04-16 17:45:49.997139209 +0000 UTC m=+286.505529389" watchObservedRunningTime="2026-04-16 17:45:49.997836026 +0000 UTC m=+286.506226186"
Apr 16 17:45:50.013257 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:50.013185 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-46ktl" podStartSLOduration=257.034859866 podStartE2EDuration="4m19.013169569s" podCreationTimestamp="2026-04-16 17:41:31 +0000 UTC" firstStartedPulling="2026-04-16 17:45:47.385996638 +0000 UTC m=+283.894386778" lastFinishedPulling="2026-04-16 17:45:49.364306341 +0000 UTC m=+285.872696481" observedRunningTime="2026-04-16 17:45:50.012692715 +0000 UTC m=+286.521082876" watchObservedRunningTime="2026-04-16 17:45:50.013169569 +0000 UTC m=+286.521559733"
Apr 16 17:45:59.972196 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:45:59.972169 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6l87f"
Apr 16 17:46:04.010143 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:04.010118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 17:46:04.010596 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:04.010140 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 17:46:04.015234 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:04.015213 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 17:46:59.911481 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.911401 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"]
Apr 16 17:46:59.914473 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.914449 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:46:59.916572 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.916542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 17:46:59.916681 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.916651 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 17:46:59.916847 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.916832 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-pjxfr\""
Apr 16 17:46:59.916927 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.916891 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 17:46:59.924877 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:46:59.924855 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"]
Apr 16 17:47:00.004661 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.004620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjswt\" (UniqueName: \"kubernetes.io/projected/652d2361-bd2c-4c84-8804-b84676f272e4-kube-api-access-sjswt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.004833 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.004679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/652d2361-bd2c-4c84-8804-b84676f272e4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.105236 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.105195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjswt\" (UniqueName: \"kubernetes.io/projected/652d2361-bd2c-4c84-8804-b84676f272e4-kube-api-access-sjswt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.105448 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.105246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/652d2361-bd2c-4c84-8804-b84676f272e4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.107741 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.107705 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/652d2361-bd2c-4c84-8804-b84676f272e4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.113543 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.113523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjswt\" (UniqueName: \"kubernetes.io/projected/652d2361-bd2c-4c84-8804-b84676f272e4-kube-api-access-sjswt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb\" (UID: \"652d2361-bd2c-4c84-8804-b84676f272e4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.224603 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.224512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:00.350195 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.347802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"]
Apr 16 17:47:00.354896 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:47:00.354868 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652d2361_bd2c_4c84_8804_b84676f272e4.slice/crio-b474d3d629ac9248e77ceb7759f793874232f8b2f1fb906bb9e68b998ff0f1a5 WatchSource:0}: Error finding container b474d3d629ac9248e77ceb7759f793874232f8b2f1fb906bb9e68b998ff0f1a5: Status 404 returned error can't find the container with id b474d3d629ac9248e77ceb7759f793874232f8b2f1fb906bb9e68b998ff0f1a5
Apr 16 17:47:00.356615 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:00.356599 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:47:01.157083 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:01.157037 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb" event={"ID":"652d2361-bd2c-4c84-8804-b84676f272e4","Type":"ContainerStarted","Data":"b474d3d629ac9248e77ceb7759f793874232f8b2f1fb906bb9e68b998ff0f1a5"}
Apr 16 17:47:04.167383 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:04.167282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb" event={"ID":"652d2361-bd2c-4c84-8804-b84676f272e4","Type":"ContainerStarted","Data":"a61b07d0e372a09623cd81dd191cbe1c6bd324ca9e904dcd9aff5985eb9de964"}
Apr 16 17:47:04.167709 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:04.167405 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:47:04.187267 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:04.187207 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb" podStartSLOduration=1.702726728 podStartE2EDuration="5.18719466s" podCreationTimestamp="2026-04-16 17:46:59 +0000 UTC" firstStartedPulling="2026-04-16 17:47:00.356715794 +0000 UTC m=+356.865105934" lastFinishedPulling="2026-04-16 17:47:03.841183724 +0000 UTC m=+360.349573866" observedRunningTime="2026-04-16 17:47:04.186007783 +0000 UTC m=+360.694397945" watchObservedRunningTime="2026-04-16 17:47:04.18719466 +0000 UTC m=+360.695584822"
Apr 16 17:47:25.172266 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:47:25.172229 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2gsbb"
Apr 16 17:48:19.110491 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.110417 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"]
Apr 16 17:48:19.113523 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.113507 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.115488 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.115469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 17:48:19.116039 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.116018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 17:48:19.116122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.116025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-kxbwm\""
Apr 16 17:48:19.116122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.116024 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 17:48:19.126046 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.126022 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"]
Apr 16 17:48:19.139191 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.139165 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-8hnfr"]
Apr 16 17:48:19.142291 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.142270 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.144318 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.144294 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 17:48:19.144318 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.144309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j5pjh\""
Apr 16 17:48:19.154085 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.154061 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8hnfr"]
Apr 16 17:48:19.198714 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.198684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99krs\" (UniqueName: \"kubernetes.io/projected/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-kube-api-access-99krs\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.198878 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.198772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-data\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.299631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.299599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-data\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.299781 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.299640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99krs\" (UniqueName: \"kubernetes.io/projected/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-kube-api-access-99krs\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.299781 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.299663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cth2k\" (UniqueName: \"kubernetes.io/projected/38612e0f-1afe-426e-87a6-94b1b57fb86f-kube-api-access-cth2k\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.299781 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.299685 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.299988 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.299968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-data\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.307825 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.307796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99krs\" (UniqueName: \"kubernetes.io/projected/3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa-kube-api-access-99krs\") pod \"seaweedfs-86cc847c5c-8hnfr\" (UID: \"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa\") " pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.400286 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.400198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cth2k\" (UniqueName: \"kubernetes.io/projected/38612e0f-1afe-426e-87a6-94b1b57fb86f-kube-api-access-cth2k\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.400286 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.400235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.400486 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:48:19.400357 2577 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 16 17:48:19.400486 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:48:19.400417 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert podName:38612e0f-1afe-426e-87a6-94b1b57fb86f nodeName:}" failed. No retries permitted until 2026-04-16 17:48:19.900402019 +0000 UTC m=+436.408792160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert") pod "llmisvc-controller-manager-68cc5db7c4-smvq2" (UID: "38612e0f-1afe-426e-87a6-94b1b57fb86f") : secret "llmisvc-webhook-server-cert" not found
Apr 16 17:48:19.408631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.408600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cth2k\" (UniqueName: \"kubernetes.io/projected/38612e0f-1afe-426e-87a6-94b1b57fb86f-kube-api-access-cth2k\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.451405 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.451370 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:19.574583 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.574536 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8hnfr"]
Apr 16 17:48:19.577942 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:48:19.577913 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3586c7d3_442f_4bca_b15e_f5ed0ab2d6aa.slice/crio-f434724c870bf7c07680fa40a5e056f3cf173627a55322f2c5a786f42cd99ff9 WatchSource:0}: Error finding container f434724c870bf7c07680fa40a5e056f3cf173627a55322f2c5a786f42cd99ff9: Status 404 returned error can't find the container with id f434724c870bf7c07680fa40a5e056f3cf173627a55322f2c5a786f42cd99ff9
Apr 16 17:48:19.903610 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.903569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:19.906085 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:19.906065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38612e0f-1afe-426e-87a6-94b1b57fb86f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-smvq2\" (UID: \"38612e0f-1afe-426e-87a6-94b1b57fb86f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:20.022745 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:20.022660 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:20.253642 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:20.253610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"]
Apr 16 17:48:20.257275 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:48:20.257249 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod38612e0f_1afe_426e_87a6_94b1b57fb86f.slice/crio-dee52039a91c7b7759adb9248e1ec0a16450b776ec920ee62426efac637f9361 WatchSource:0}: Error finding container dee52039a91c7b7759adb9248e1ec0a16450b776ec920ee62426efac637f9361: Status 404 returned error can't find the container with id dee52039a91c7b7759adb9248e1ec0a16450b776ec920ee62426efac637f9361
Apr 16 17:48:20.371160 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:20.371124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2" event={"ID":"38612e0f-1afe-426e-87a6-94b1b57fb86f","Type":"ContainerStarted","Data":"dee52039a91c7b7759adb9248e1ec0a16450b776ec920ee62426efac637f9361"}
Apr 16 17:48:20.372214 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:20.372181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8hnfr" event={"ID":"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa","Type":"ContainerStarted","Data":"f434724c870bf7c07680fa40a5e056f3cf173627a55322f2c5a786f42cd99ff9"}
Apr 16 17:48:23.384318 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.384284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2" event={"ID":"38612e0f-1afe-426e-87a6-94b1b57fb86f","Type":"ContainerStarted","Data":"e4b78aa2f64243aaef21a1fd5af82efb860304107744f1a28f68b2cc95223377"}
Apr 16 17:48:23.384787 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.384393 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:23.385563 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.385543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8hnfr" event={"ID":"3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa","Type":"ContainerStarted","Data":"3ed4d9f34915151f6d80e8efb88da1a92fb5c6e9f1377f6f37fafba60114d036"}
Apr 16 17:48:23.385658 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.385647 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:23.404250 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.404191 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2" podStartSLOduration=1.6858673419999999 podStartE2EDuration="4.404179756s" podCreationTimestamp="2026-04-16 17:48:19 +0000 UTC" firstStartedPulling="2026-04-16 17:48:20.258964335 +0000 UTC m=+436.767354481" lastFinishedPulling="2026-04-16 17:48:22.977276752 +0000 UTC m=+439.485666895" observedRunningTime="2026-04-16 17:48:23.403646157 +0000 UTC m=+439.912036342" watchObservedRunningTime="2026-04-16 17:48:23.404179756 +0000 UTC m=+439.912569918"
Apr 16 17:48:23.424597 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:23.424548 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-8hnfr" podStartSLOduration=0.973673382 podStartE2EDuration="4.424532955s" podCreationTimestamp="2026-04-16 17:48:19 +0000 UTC" firstStartedPulling="2026-04-16 17:48:19.579104132 +0000 UTC m=+436.087494272" lastFinishedPulling="2026-04-16 17:48:23.029963705 +0000 UTC m=+439.538353845" observedRunningTime="2026-04-16 17:48:23.42286044 +0000 UTC m=+439.931250601" watchObservedRunningTime="2026-04-16 17:48:23.424532955 +0000 UTC m=+439.932923117"
Apr 16 17:48:29.390670 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:29.390640 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-8hnfr"
Apr 16 17:48:54.389954 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:54.389921 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-smvq2"
Apr 16 17:48:55.727404 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.727371 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-9rl7c"]
Apr 16 17:48:55.730547 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.730530 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c"
Apr 16 17:48:55.733174 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.733157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 17:48:55.733270 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.733200 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-zcdkd\""
Apr 16 17:48:55.739527 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.739508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-9rl7c"]
Apr 16 17:48:55.883166 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.883135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a0ede3-8065-4152-aafe-b11c955a7ba6-cert\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") " pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c"
Apr 16 17:48:55.883370 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.883192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trbr\" (UniqueName: \"kubernetes.io/projected/30a0ede3-8065-4152-aafe-b11c955a7ba6-kube-api-access-2trbr\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") " pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c"
Apr 16 17:48:55.983903 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.983816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2trbr\" (UniqueName: \"kubernetes.io/projected/30a0ede3-8065-4152-aafe-b11c955a7ba6-kube-api-access-2trbr\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") "
pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:55.983903 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.983895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a0ede3-8065-4152-aafe-b11c955a7ba6-cert\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") " pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:55.986493 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.986466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30a0ede3-8065-4152-aafe-b11c955a7ba6-cert\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") " pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:55.991858 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:55.991833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trbr\" (UniqueName: \"kubernetes.io/projected/30a0ede3-8065-4152-aafe-b11c955a7ba6-kube-api-access-2trbr\") pod \"kserve-controller-manager-7f8f4564d-9rl7c\" (UID: \"30a0ede3-8065-4152-aafe-b11c955a7ba6\") " pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:56.040239 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:56.040199 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:56.158552 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:56.158524 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-9rl7c"] Apr 16 17:48:56.161744 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:48:56.161715 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a0ede3_8065_4152_aafe_b11c955a7ba6.slice/crio-88d7b6460a9d0a1996650a51bc7caf7108f4d106c2847ae278861295d8d0c593 WatchSource:0}: Error finding container 88d7b6460a9d0a1996650a51bc7caf7108f4d106c2847ae278861295d8d0c593: Status 404 returned error can't find the container with id 88d7b6460a9d0a1996650a51bc7caf7108f4d106c2847ae278861295d8d0c593 Apr 16 17:48:56.475810 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:56.475773 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" event={"ID":"30a0ede3-8065-4152-aafe-b11c955a7ba6","Type":"ContainerStarted","Data":"88d7b6460a9d0a1996650a51bc7caf7108f4d106c2847ae278861295d8d0c593"} Apr 16 17:48:59.487704 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:59.487673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" event={"ID":"30a0ede3-8065-4152-aafe-b11c955a7ba6","Type":"ContainerStarted","Data":"1616f85d5173067d923ab5aaaad53a11f2ce1b1f1587b588e60e9349e79a76ca"} Apr 16 17:48:59.488098 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:59.487814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:48:59.503840 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:48:59.503790 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" podStartSLOduration=2.201885119 
podStartE2EDuration="4.503776573s" podCreationTimestamp="2026-04-16 17:48:55 +0000 UTC" firstStartedPulling="2026-04-16 17:48:56.163061851 +0000 UTC m=+472.671451991" lastFinishedPulling="2026-04-16 17:48:58.464953303 +0000 UTC m=+474.973343445" observedRunningTime="2026-04-16 17:48:59.502488339 +0000 UTC m=+476.010878502" watchObservedRunningTime="2026-04-16 17:48:59.503776573 +0000 UTC m=+476.012166803" Apr 16 17:49:30.495056 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:49:30.495024 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f8f4564d-9rl7c" Apr 16 17:49:46.741633 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:49:46.741547 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6978867857-bcqhn"] Apr 16 17:50:11.760892 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:11.760832 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6978867857-bcqhn" podUID="be977df4-a82e-439f-a7ac-81d911856465" containerName="console" containerID="cri-o://732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2" gracePeriod=15 Apr 16 17:50:11.996426 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:11.996397 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6978867857-bcqhn_be977df4-a82e-439f-a7ac-81d911856465/console/0.log" Apr 16 17:50:11.996554 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:11.996458 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:50:12.070631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070541 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.070631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070592 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sw4f\" (UniqueName: \"kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.070631 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070620 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.070897 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070812 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.070897 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070853 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.070897 
ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070894 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.071047 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070917 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config\") pod \"be977df4-a82e-439f-a7ac-81d911856465\" (UID: \"be977df4-a82e-439f-a7ac-81d911856465\") " Apr 16 17:50:12.071047 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.070999 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:50:12.071151 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.071134 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-oauth-serving-cert\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.071282 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.071254 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:50:12.071418 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.071399 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config" (OuterVolumeSpecName: "console-config") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:50:12.071498 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.071464 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca" (OuterVolumeSpecName: "service-ca") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:50:12.072847 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.072824 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:50:12.072945 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.072917 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f" (OuterVolumeSpecName: "kube-api-access-7sw4f") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "kube-api-access-7sw4f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:50:12.073008 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.072993 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "be977df4-a82e-439f-a7ac-81d911856465" (UID: "be977df4-a82e-439f-a7ac-81d911856465"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:50:12.176569 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176518 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-trusted-ca-bundle\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.176740 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176588 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-service-ca\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.176740 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176715 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be977df4-a82e-439f-a7ac-81d911856465-console-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.176740 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176724 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-oauth-config\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.176740 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176734 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/be977df4-a82e-439f-a7ac-81d911856465-console-serving-cert\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.176880 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.176743 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sw4f\" (UniqueName: \"kubernetes.io/projected/be977df4-a82e-439f-a7ac-81d911856465-kube-api-access-7sw4f\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\"" Apr 16 17:50:12.709798 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709772 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6978867857-bcqhn_be977df4-a82e-439f-a7ac-81d911856465/console/0.log" Apr 16 17:50:12.709981 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709825 2577 generic.go:358] "Generic (PLEG): container finished" podID="be977df4-a82e-439f-a7ac-81d911856465" containerID="732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2" exitCode=2 Apr 16 17:50:12.709981 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6978867857-bcqhn" event={"ID":"be977df4-a82e-439f-a7ac-81d911856465","Type":"ContainerDied","Data":"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2"} Apr 16 17:50:12.709981 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6978867857-bcqhn" event={"ID":"be977df4-a82e-439f-a7ac-81d911856465","Type":"ContainerDied","Data":"7600de6e13c09609853c2e461f256c1f22a617f267f3b35a6b58e483857eb140"} Apr 16 17:50:12.709981 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709911 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6978867857-bcqhn" Apr 16 17:50:12.710167 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.709915 2577 scope.go:117] "RemoveContainer" containerID="732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2" Apr 16 17:50:12.717920 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.717903 2577 scope.go:117] "RemoveContainer" containerID="732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2" Apr 16 17:50:12.718155 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:50:12.718136 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2\": container with ID starting with 732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2 not found: ID does not exist" containerID="732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2" Apr 16 17:50:12.718209 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.718164 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2"} err="failed to get container status \"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2\": rpc error: code = NotFound desc = could not find container \"732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2\": container with ID starting with 732bf8b1dd28d7564d0bb9a5d0601f04ca849782d8e92331b3986c6a719c01c2 not found: ID does not exist" Apr 16 17:50:12.725289 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.725269 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6978867857-bcqhn"] Apr 16 17:50:12.729764 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:50:12.729745 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6978867857-bcqhn"] Apr 16 17:50:14.120828 ip-10-0-143-234 kubenswrapper[2577]: 
I0416 17:50:14.120796 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be977df4-a82e-439f-a7ac-81d911856465" path="/var/lib/kubelet/pods/be977df4-a82e-439f-a7ac-81d911856465/volumes" Apr 16 17:51:04.030242 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:51:04.030217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:51:04.030771 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:51:04.030755 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log" Apr 16 17:54:43.583720 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.583690 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"] Apr 16 17:54:43.584122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.583974 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be977df4-a82e-439f-a7ac-81d911856465" containerName="console" Apr 16 17:54:43.584122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.583986 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="be977df4-a82e-439f-a7ac-81d911856465" containerName="console" Apr 16 17:54:43.584122 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.584034 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="be977df4-a82e-439f-a7ac-81d911856465" containerName="console" Apr 16 17:54:43.586878 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.586859 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.589043 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.589022 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-e4bae-serving-cert\"" Apr 16 17:54:43.589147 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.589045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-e4bae-kube-rbac-proxy-sar-config\"" Apr 16 17:54:43.589366 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.589353 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:54:43.589743 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.589729 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kfjrd\"" Apr 16 17:54:43.597233 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.597214 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"] Apr 16 17:54:43.690316 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.690279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls\") pod \"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.690510 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.690325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle\") pod 
\"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.791304 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.791266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls\") pod \"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.791513 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.791315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.791977 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.791953 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.793751 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.793730 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls\") pod \"model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:43.896667 
ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:43.896563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:44.014883 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:44.014850 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"] Apr 16 17:54:44.018160 ip-10-0-143-234 kubenswrapper[2577]: W0416 17:54:44.018129 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1acc5a3_813d_4ea3_b676_1f71b93cfcd2.slice/crio-bf3c65d9397c463f80144a90216beeffd4c29fdf39a1eb5481e8c095dc9fc9e2 WatchSource:0}: Error finding container bf3c65d9397c463f80144a90216beeffd4c29fdf39a1eb5481e8c095dc9fc9e2: Status 404 returned error can't find the container with id bf3c65d9397c463f80144a90216beeffd4c29fdf39a1eb5481e8c095dc9fc9e2 Apr 16 17:54:44.019803 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:44.019788 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:54:44.460208 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:44.460175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" event={"ID":"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2","Type":"ContainerStarted","Data":"bf3c65d9397c463f80144a90216beeffd4c29fdf39a1eb5481e8c095dc9fc9e2"} Apr 16 17:54:46.466082 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:46.465994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" event={"ID":"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2","Type":"ContainerStarted","Data":"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"} Apr 16 17:54:46.466468 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:46.466122 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:46.482386 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:46.482303 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podStartSLOduration=1.379847384 podStartE2EDuration="3.482290575s" podCreationTimestamp="2026-04-16 17:54:43 +0000 UTC" firstStartedPulling="2026-04-16 17:54:44.019907538 +0000 UTC m=+820.528297681" lastFinishedPulling="2026-04-16 17:54:46.122350731 +0000 UTC m=+822.630740872" observedRunningTime="2026-04-16 17:54:46.481931522 +0000 UTC m=+822.990321685" watchObservedRunningTime="2026-04-16 17:54:46.482290575 +0000 UTC m=+822.990680758" Apr 16 17:54:52.474030 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:52.474001 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" Apr 16 17:54:53.623500 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:53.623466 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"] Apr 16 17:54:53.623856 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:53.623696 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" containerID="cri-o://f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c" gracePeriod=30 Apr 16 17:54:57.473560 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:54:57.473525 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 
16 17:55:02.473731 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:02.473692 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:55:07.472996 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:07.472961 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:55:07.473408 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:07.473071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"
Apr 16 17:55:12.473552 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:12.473505 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:55:17.473737 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:17.473701 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:55:22.473730 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:22.473676 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:55:23.643677 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:55:23.643647 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1acc5a3_813d_4ea3_b676_1f71b93cfcd2.slice/crio-conmon-f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 17:55:23.643677 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:55:23.643663 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1acc5a3_813d_4ea3_b676_1f71b93cfcd2.slice/crio-conmon-f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 17:55:23.771657 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.771634 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"
Apr 16 17:55:23.871525 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.871500 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle\") pod \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") "
Apr 16 17:55:23.871674 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.871562 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls\") pod \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\" (UID: \"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2\") "
Apr 16 17:55:23.871842 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.871807 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" (UID: "d1acc5a3-813d-4ea3-b676-1f71b93cfcd2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:55:23.873765 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.873745 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" (UID: "d1acc5a3-813d-4ea3-b676-1f71b93cfcd2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:55:23.972376 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.972350 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-proxy-tls\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:55:23.972376 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:23.972371 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2-openshift-service-ca-bundle\") on node \"ip-10-0-143-234.ec2.internal\" DevicePath \"\""
Apr 16 17:55:24.575983 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.575957 2577 generic.go:358] "Generic (PLEG): container finished" podID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerID="f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c" exitCode=0
Apr 16 17:55:24.576138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.576016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" event={"ID":"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2","Type":"ContainerDied","Data":"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"}
Apr 16 17:55:24.576138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.576024 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"
Apr 16 17:55:24.576138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.576042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq" event={"ID":"d1acc5a3-813d-4ea3-b676-1f71b93cfcd2","Type":"ContainerDied","Data":"bf3c65d9397c463f80144a90216beeffd4c29fdf39a1eb5481e8c095dc9fc9e2"}
Apr 16 17:55:24.576138 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.576057 2577 scope.go:117] "RemoveContainer" containerID="f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"
Apr 16 17:55:24.584195 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.584177 2577 scope.go:117] "RemoveContainer" containerID="f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"
Apr 16 17:55:24.584426 ip-10-0-143-234 kubenswrapper[2577]: E0416 17:55:24.584409 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c\": container with ID starting with f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c not found: ID does not exist" containerID="f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"
Apr 16 17:55:24.584481 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.584435 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c"} err="failed to get container status \"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c\": rpc error: code = NotFound desc = could not find container \"f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c\": container with ID starting with f156249e4061940f661d398234519cc1428dc4fe6efa72add1e89e152d76657c not found: ID does not exist"
Apr 16 17:55:24.590205 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.590183 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"]
Apr 16 17:55:24.597987 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:24.597964 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-e4bae-546dd95b78-2gpkq"]
Apr 16 17:55:26.121304 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:55:26.121272 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" path="/var/lib/kubelet/pods/d1acc5a3-813d-4ea3-b676-1f71b93cfcd2/volumes"
Apr 16 17:56:04.049034 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:56:04.048419 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 17:56:04.051884 ip-10-0-143-234 kubenswrapper[2577]: I0416 17:56:04.051865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 18:01:04.071434 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:01:04.071403 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 18:01:04.073211 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:01:04.073190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 18:03:07.671262 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:07.671229 2577 ???:1] "http: TLS handshake error from 10.0.140.62:59770: EOF"
Apr 16 18:03:07.674687 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:07.674665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7t8dz_d66127b0-6df7-4368-bf73-d0b830421d6c/global-pull-secret-syncer/0.log"
Apr 16 18:03:07.772953 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:07.772925 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2xwww_3d913013-d26e-4756-9b14-4e6907f4baf0/konnectivity-agent/0.log"
Apr 16 18:03:07.907510 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:07.907479 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-234.ec2.internal_0ed94319a7f3740b078962730ca47007/haproxy/0.log"
Apr 16 18:03:11.108430 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.108400 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/alertmanager/0.log"
Apr 16 18:03:11.137073 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.137038 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/config-reloader/0.log"
Apr 16 18:03:11.163443 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.163414 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/kube-rbac-proxy-web/0.log"
Apr 16 18:03:11.190613 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.190590 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/kube-rbac-proxy/0.log"
Apr 16 18:03:11.218045 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.218025 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/kube-rbac-proxy-metric/0.log"
Apr 16 18:03:11.243487 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.243468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/prom-label-proxy/0.log"
Apr 16 18:03:11.270908 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.270888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d71fe6d1-b0f9-42ae-9c9d-3fd2d3e1a258/init-config-reloader/0.log"
Apr 16 18:03:11.446255 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.446187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-9m42m_cd1706f9-bd64-44a9-bb23-a64284d2567a/monitoring-plugin/0.log"
Apr 16 18:03:11.480178 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.480151 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jqz_208f38dc-f3ce-4e79-b9b1-1106f65c0831/node-exporter/0.log"
Apr 16 18:03:11.513467 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.513444 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jqz_208f38dc-f3ce-4e79-b9b1-1106f65c0831/kube-rbac-proxy/0.log"
Apr 16 18:03:11.536406 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:11.536385 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jqz_208f38dc-f3ce-4e79-b9b1-1106f65c0831/init-textfile/0.log"
Apr 16 18:03:13.340795 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:13.340768 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-46ktl_ab1c4c42-78b6-4a7e-aadc-83a1b22e89c1/networking-console-plugin/0.log"
Apr 16 18:03:14.867796 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.867767 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"]
Apr 16 18:03:14.868184 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.868073 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae"
Apr 16 18:03:14.868184 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.868086 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae"
Apr 16 18:03:14.868184 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.868151 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1acc5a3-813d-4ea3-b676-1f71b93cfcd2" containerName="model-chainer-raw-hpa-e4bae"
Apr 16 18:03:14.871244 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.871227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:14.873503 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.873478 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xznzl\"/\"default-dockercfg-b8sdr\""
Apr 16 18:03:14.873603 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.873525 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xznzl\"/\"openshift-service-ca.crt\""
Apr 16 18:03:14.873603 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.873576 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xznzl\"/\"kube-root-ca.crt\""
Apr 16 18:03:14.879397 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.879374 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"]
Apr 16 18:03:14.937644 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.937598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-sys\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:14.937644 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.937648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-lib-modules\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:14.937847 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.937680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/e36dc923-acf2-4f61-916b-41276c8154fd-kube-api-access-gncfh\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:14.937847 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.937712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-podres\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:14.937847 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:14.937736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-proc\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038515 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-lib-modules\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038515 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/e36dc923-acf2-4f61-916b-41276c8154fd-kube-api-access-gncfh\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-podres\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-proc\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-sys\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-lib-modules\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-sys\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-proc\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.038775 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.038730 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e36dc923-acf2-4f61-916b-41276c8154fd-podres\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.045832 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.045804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/e36dc923-acf2-4f61-916b-41276c8154fd-kube-api-access-gncfh\") pod \"perf-node-gather-daemonset-gzk6b\" (UID: \"e36dc923-acf2-4f61-916b-41276c8154fd\") " pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.181744 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.181638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.195841 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.195816 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6l87f_d0738358-399f-4f84-8552-0728eba20372/dns/0.log"
Apr 16 18:03:15.217541 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.217508 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6l87f_d0738358-399f-4f84-8552-0728eba20372/kube-rbac-proxy/0.log"
Apr 16 18:03:15.304407 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.304373 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"]
Apr 16 18:03:15.308041 ip-10-0-143-234 kubenswrapper[2577]: W0416 18:03:15.308013 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode36dc923_acf2_4f61_916b_41276c8154fd.slice/crio-0b1b6af61628d51080418ee332c341b0277e134cb828e16e3c9230529936dbe5 WatchSource:0}: Error finding container 0b1b6af61628d51080418ee332c341b0277e134cb828e16e3c9230529936dbe5: Status 404 returned error can't find the container with id 0b1b6af61628d51080418ee332c341b0277e134cb828e16e3c9230529936dbe5
Apr 16 18:03:15.309657 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.309640 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:03:15.335488 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.335465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vrkbq_2e01f328-7d13-47e4-ba26-d47919ca94fb/dns-node-resolver/0.log"
Apr 16 18:03:15.768024 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.767995 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pv5jg_8643560d-c751-40a2-a84e-fd9619f0a198/node-ca/0.log"
Apr 16 18:03:15.872511 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.872474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b" event={"ID":"e36dc923-acf2-4f61-916b-41276c8154fd","Type":"ContainerStarted","Data":"c418f9aa97c4d1174d893de90a6db62003b081eb79c1e88e7560b8ea2e98a41a"}
Apr 16 18:03:15.872511 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.872510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b" event={"ID":"e36dc923-acf2-4f61-916b-41276c8154fd","Type":"ContainerStarted","Data":"0b1b6af61628d51080418ee332c341b0277e134cb828e16e3c9230529936dbe5"}
Apr 16 18:03:15.873033 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.872625 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:15.887885 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:15.887831 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b" podStartSLOduration=1.887818126 podStartE2EDuration="1.887818126s" podCreationTimestamp="2026-04-16 18:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:15.886516312 +0000 UTC m=+1332.394906472" watchObservedRunningTime="2026-04-16 18:03:15.887818126 +0000 UTC m=+1332.396208289"
Apr 16 18:03:16.838985 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:16.838952 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zgfj4_573f0e79-0a24-47b1-9570-15a67f037365/serve-healthcheck-canary/0.log"
Apr 16 18:03:17.291398 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:17.291302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5p4pc_aeebceb2-b35a-4208-ae0d-f95a63aa4920/kube-rbac-proxy/0.log"
Apr 16 18:03:17.329518 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:17.329491 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5p4pc_aeebceb2-b35a-4208-ae0d-f95a63aa4920/exporter/0.log"
Apr 16 18:03:17.364853 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:17.364825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5p4pc_aeebceb2-b35a-4208-ae0d-f95a63aa4920/extractor/0.log"
Apr 16 18:03:19.331958 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:19.331926 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7f8f4564d-9rl7c_30a0ede3-8065-4152-aafe-b11c955a7ba6/manager/0.log"
Apr 16 18:03:19.352483 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:19.352450 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-smvq2_38612e0f-1afe-426e-87a6-94b1b57fb86f/manager/0.log"
Apr 16 18:03:19.571707 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:19.571672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-8hnfr_3586c7d3-442f-4bca-b15e-f5ed0ab2d6aa/seaweedfs/0.log"
Apr 16 18:03:21.884650 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:21.884622 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xznzl/perf-node-gather-daemonset-gzk6b"
Apr 16 18:03:24.630486 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.630453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:03:24.653928 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.653897 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/egress-router-binary-copy/0.log"
Apr 16 18:03:24.675325 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.675297 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/cni-plugins/0.log"
Apr 16 18:03:24.699949 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.699921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/bond-cni-plugin/0.log"
Apr 16 18:03:24.723875 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.723851 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/routeoverride-cni/0.log"
Apr 16 18:03:24.746653 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.746625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/whereabouts-cni-bincopy/0.log"
Apr 16 18:03:24.767973 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.767956 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s72ln_f243dd2e-6d7f-4c1b-9ec7-346a02c79bba/whereabouts-cni/0.log"
Apr 16 18:03:24.973655 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.973630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q4vbk_6ed2c2e6-5851-4969-afa8-f8336c09ee54/kube-multus/0.log"
Apr 16 18:03:24.995266 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:24.995239 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gg8gs_eccdd8a8-ee59-4c3c-852e-f012ce698554/network-metrics-daemon/0.log"
Apr 16 18:03:25.015019 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:25.014997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gg8gs_eccdd8a8-ee59-4c3c-852e-f012ce698554/kube-rbac-proxy/0.log"
Apr 16 18:03:26.412642 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.412563 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-controller/0.log"
Apr 16 18:03:26.434239 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.434215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/0.log"
Apr 16 18:03:26.440066 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.440051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovn-acl-logging/1.log"
Apr 16 18:03:26.466964 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.466939 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/kube-rbac-proxy-node/0.log"
Apr 16 18:03:26.490534 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.490506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:03:26.511384 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.511355 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/northd/0.log"
Apr 16 18:03:26.532966 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.532942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/nbdb/0.log"
Apr 16 18:03:26.555568 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.555544 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/sbdb/0.log"
Apr 16 18:03:26.638660 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:26.638624 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g5src_2a931acd-9936-4d4e-a3b6-d2d86cb92da4/ovnkube-controller/0.log"
Apr 16 18:03:27.528078 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:27.528048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-zll94_f5699995-82fb-44e3-a47d-70164f1e97cd/check-endpoints/0.log"
Apr 16 18:03:27.578123 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:27.578096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vwz6h_2436cc07-66d7-4793-9260-5c3585aae363/network-check-target-container/0.log"
Apr 16 18:03:28.442781 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:28.442754 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2krk7_24061e65-3c69-48d6-8110-9c66fb64e102/iptables-alerter/0.log"
Apr 16 18:03:29.094738 ip-10-0-143-234 kubenswrapper[2577]: I0416 18:03:29.094681 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dfv6z_f144295e-123d-49ad-96f0-a793fc10f2bd/tuned/0.log"