Mar 18 16:44:16.848142 ip-10-0-137-219 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:17.355307 ip-10-0-137-219 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:17.355307 ip-10-0-137-219 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:17.355307 ip-10-0-137-219 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:17.355307 ip-10-0-137-219 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:17.355307 ip-10-0-137-219 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
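The deprecation warnings above point at the kubelet config file mechanism: each flagged option has a corresponding field in the KubeletConfiguration object loaded via --config. A minimal sketch of the equivalent config, with illustrative values only (the actual resource sizes and paths on this node are not shown in the log, apart from the CRI-O socket that appears later in the FLAG dump):

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment; field names are the
# upstream KubeletConfiguration v1beta1 equivalents of the deprecated flags.
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path is illustrative)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (sizes are illustrative)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```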
Mar 18 16:44:17.357104 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.357014    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:17.360228 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360206    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:17.360228 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360229    2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360233    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360236    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360239    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360244    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360249    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360252    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360255    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360258    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360262    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360265    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360269    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360273    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360276    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360279    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360282    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360285    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360287    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360290    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:17.360303 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360293    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360296    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360299    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360302    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360305    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360308    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360311    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360314    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360317    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360320    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360323    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360325    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360328    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360331    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360334    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360337    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360339    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360344    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360347    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360358    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:17.360865 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360362    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360364    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360367    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360370    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360373    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360376    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360379    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360381    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360384    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360387    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360389    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360392    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360395    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360398    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360401    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360404    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360406    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360409    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360412    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360415    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:17.361439 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360418    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360420    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360423    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360426    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360429    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360431    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360434    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360437    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360439    2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360443    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360446    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360449    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360455    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360457    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360460    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360463    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360466    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360469    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360472    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:17.362002 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360474    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360477    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360480    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360483    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360486    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360489    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360491    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360979    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360986    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360989    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360993    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360995    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.360998    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361001    2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361004    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361008    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361010    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361013    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361015    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361018    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:17.362518 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361021    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361024    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361026    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361029    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361032    2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361035    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361038    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361041    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361043    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361047    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361049    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361052    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361055    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361057    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361060    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361063    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361065    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361068    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361071    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361074    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:17.363120 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361076    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361080    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361084    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361088    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361092    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361094    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361097    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361100    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361103    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361107    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361109    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361112    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361115    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361118    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361120    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361123    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361126    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361129    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361131    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:17.363730 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361134    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361136    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361140    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361142    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361145    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361148    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361150    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361153    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361155    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361158    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361160    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361163    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361166    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361169    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361171    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361174    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361177    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361180    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361183    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361185    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:17.364286 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361188    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361191    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361194    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361197    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361199    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361202    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361204    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361207    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361209    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361212    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361214    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361218    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361220    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361223    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361300    2571 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361308    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361318    2571 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361324    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361331    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361336    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361342    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:17.364862 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361347    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361351    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361354    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361358    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361362    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361365    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361368    2571 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361371    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361374    2571 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361377    2571 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361380    2571 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361384    2571 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361388    2571 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361391    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361394    2571 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361397    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361401    2571 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361405    2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361408    2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361412    2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361416    2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361419    2571 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361422    2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361426    2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361429    2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:17.365393 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361432    2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361437    2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361440    2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361443    2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18
16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361446 2571 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361455 2571 flags.go:64] FLAG: --enable-server="true" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361458 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361463 2571 flags.go:64] FLAG: --event-burst="100" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361466 2571 flags.go:64] FLAG: --event-qps="50" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361469 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361474 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361477 2571 flags.go:64] FLAG: --eviction-hard="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361481 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361484 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361487 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361490 2571 flags.go:64] FLAG: --eviction-soft="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361493 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361496 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361499 2571 flags.go:64] 
FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361502 2571 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361505 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361508 2571 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361511 2571 flags.go:64] FLAG: --feature-gates="" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361516 2571 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361519 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:17.366026 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361522 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361525 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361529 2571 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361532 2571 flags.go:64] FLAG: --help="false" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361535 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-137-219.ec2.internal" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361539 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361542 2571 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361545 2571 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361549 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361552 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361555 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361558 2571 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361563 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361566 2571 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361569 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361572 2571 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361577 2571 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361580 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361583 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361586 2571 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361589 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:17.366667 ip-10-0-137-219 
kubenswrapper[2571]: I0318 16:44:17.361592 2571 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361595 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361598 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:17.366667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361602 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361607 2571 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361610 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361614 2571 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361617 2571 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361620 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361623 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361626 2571 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361629 2571 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361634 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361637 2571 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361642 2571 flags.go:64] FLAG: --max-pods="110" Mar 18 
16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361645 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361648 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361651 2571 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361654 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361657 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361660 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361663 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361672 2571 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361677 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361680 2571 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361684 2571 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:17.367278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361687 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361695 2571 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 
16:44:17.361698 2571 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361702 2571 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361705 2571 flags.go:64] FLAG: --port="10250" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361708 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361711 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03c2b3c3a4d3c51a2" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361715 2571 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361718 2571 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361721 2571 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361724 2571 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361727 2571 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361731 2571 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361734 2571 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361737 2571 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361740 2571 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361744 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361747 2571 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361750 2571 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361753 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361756 2571 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361759 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361762 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361766 2571 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361769 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361773 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:17.367864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361776 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361779 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361782 2571 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361785 2571 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361790 2571 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361793 2571 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 
16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361796 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361800 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361803 2571 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361806 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361812 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361815 2571 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361833 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361838 2571 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361841 2571 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361844 2571 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361847 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361850 2571 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361853 2571 flags.go:64] FLAG: --v="2" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361858 2571 flags.go:64] FLAG: --version="false" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361862 2571 flags.go:64] FLAG: --vmodule="" 
Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361866 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.361870 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361989 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:17.368552 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361993 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361996 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.361999 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362002 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362005 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362007 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362011 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362014 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362017 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362020 2571 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362023 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362026 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362031 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362034 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362037 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362041 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362044 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362047 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362049 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:17.369581 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362052 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362055 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362057 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362060 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362064 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362068 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362071 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362074 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362077 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362079 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362082 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362085 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362087 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362090 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362093 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:17.370402 ip-10-0-137-219 
kubenswrapper[2571]: W0318 16:44:17.362095 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362098 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362100 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362103 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362105 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:17.370402 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362108 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362111 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362114 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362117 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362119 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362123 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362126 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362128 2571 feature_gate.go:328] unrecognized 
feature gate: MixedCPUsAllocation Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362133 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362135 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362138 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362142 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362146 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362149 2571 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362152 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362155 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362158 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362162 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362165 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:17.371234 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362168 2571 feature_gate.go:328] unrecognized feature gate: 
SignatureStores Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362170 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362173 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362176 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362179 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362182 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362185 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362188 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362191 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362194 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362197 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362199 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362202 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362204 2571 feature_gate.go:328] unrecognized 
feature gate: PreconfiguredUDNAddresses Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362207 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362209 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362212 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362215 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362220 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362222 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:17.371734 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362225 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362229 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362232 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362235 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362238 2571 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362240 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.362243 2571 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.363043 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.371391 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.371413 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371477 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371482 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371485 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371489 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371492 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371495 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:17.372314 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371498 2571 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371501 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371504 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371507 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371509 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371512 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371515 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371518 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371521 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371523 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371526 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371529 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371533 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371538 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371542 2571 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371545 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371549 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371553 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371556 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:17.372720 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371560 2571 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371563 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371566 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371569 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371572 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371574 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:17.373217 ip-10-0-137-219 
kubenswrapper[2571]: W0318 16:44:17.371577 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371580 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371582 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371585 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371588 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371590 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371593 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371596 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371598 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371601 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371604 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371606 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371609 2571 feature_gate.go:328] unrecognized feature 
gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:17.373217 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371611 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371614 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371617 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371619 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371622 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371625 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371627 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371630 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371632 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371635 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371638 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371641 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 
16:44:17.371644 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371646 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371649 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371651 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371654 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371657 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371659 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371662 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:17.373737 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371665 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371667 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371670 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371672 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371675 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:17.374253 
ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371677 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371680 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371683 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371685 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371688 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371690 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371694 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371698 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371701 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371704 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371706 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371709 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371712 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371714 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:17.374253 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371717 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371720 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371722 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.371728 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371861 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371868 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371871 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371874 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371877 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371880 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371883 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371886 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371890 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371892 2571 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371895 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:17.374736 ip-10-0-137-219 kubenswrapper[2571]: W0318 
16:44:17.371898 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371901 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371903 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371906 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371909 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371912 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371915 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371918 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371920 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371923 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371926 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371934 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371937 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 
16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371939 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371942 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371945 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371948 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371950 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371953 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371956 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:17.375194 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371958 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371961 2571 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371964 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371967 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371970 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371972 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:17.375682 
ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371975 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371978 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371980 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371983 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371986 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371989 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371992 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.371996 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372000 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372004 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372007 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372011 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372014 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:17.375682 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372016 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372019 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372022 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372025 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372028 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372031 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372034 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372037 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372039 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: 
W0318 16:44:17.372042 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372045 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372048 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372051 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372054 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372056 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372059 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372062 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372064 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372067 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372070 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:17.376301 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372072 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372075 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 
16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372078 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372081 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372083 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372086 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372089 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372091 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372094 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372097 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372100 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372102 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372105 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372108 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372110 2571 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:17.376897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:17.372113 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:17.377271 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.372118 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:17.377271 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.372936 2571 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:17.377271 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.376341 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:17.377466 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.377452 2571 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:17.377572 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.377552 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:17.377608 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.377596 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:17.406540 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.406512 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:17.410391 ip-10-0-137-219 kubenswrapper[2571]: 
I0318 16:44:17.410371 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:44:17.430011 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.429984 2571 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:44:17.436386 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.436365 2571 log.go:25] "Validated CRI v1 image API"
Mar 18 16:44:17.437866 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.437849 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:44:17.439113 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.439095 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:17.442343 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.442320 2571 fs.go:135] Filesystem UUIDs: map[1b47c4c5-377f-4f5d-8e80-c7ea3ecf9f40:/dev/nvme0n1p3 35e0f00f-8c1a-4b6d-b6db-012fb040ac70:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Mar 18 16:44:17.442411 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.442343 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:44:17.449815 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.449456 2571 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:17.446582316 +0000 UTC m=+0.466242575 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103584 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27a0acde63a74ea077826c0d870f33 SystemUUID:ec27a0ac-de63-a74e-a077-826c0d870f33 BootID:f04d04d8-499e-497c-9e58-2ebe5d703a6c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0a:9b:a8:0c:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0a:9b:a8:0c:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:35:72:11:88:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:44:17.449815 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.449800 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:44:17.449976 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.449927 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:44:17.451247 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.451221 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:44:17.451402 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.451252 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-219.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:44:17.451444 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.451412 2571 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:44:17.451444 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.451425 2571 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:44:17.451444 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.451440 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:17.452373 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.452362 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:17.454151 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.454137 2571 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:17.454270 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.454261 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:44:17.457908 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.457893 2571 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:44:17.457948 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.457912 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:44:17.457948 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.457927 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:44:17.457948 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.457938 2571 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:44:17.457948 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.457949 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 16:44:17.459247 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.459231 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:17.459303 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.459252 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:17.462714 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.462698 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 18 16:44:17.464923 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.464896 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 18 16:44:17.466844 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466830 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466853 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466863 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466870 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466876 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466882 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466888 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466893 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 16:44:17.466897 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466901 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 16:44:17.467120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466907 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 16:44:17.467120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466916 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 16:44:17.467120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.466926 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 16:44:17.468325 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.468313 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 16:44:17.468378 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.468327 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 18 16:44:17.471047 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.471021 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-219.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 18 16:44:17.471047 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.471038 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 18 16:44:17.472183 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.472169 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 18 16:44:17.472247 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.472210 2571 server.go:1295] "Started kubelet"
Mar 18 16:44:17.472299 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.472270 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 16:44:17.472391 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.472334 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 16:44:17.472434 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.472410 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 18 16:44:17.473075 ip-10-0-137-219 systemd[1]: Started Kubernetes Kubelet.
Mar 18 16:44:17.474008 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.473962 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 16:44:17.475715 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.475700 2571 server.go:317] "Adding debug handlers to kubelet server"
Mar 18 16:44:17.481776 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.481752 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-219.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 16:44:17.482743 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.482722 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 18 16:44:17.482871 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.481753 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-219.ec2.internal.189dfd3adaa779e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-219.ec2.internal,UID:ip-10-0-137-219.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-219.ec2.internal,},FirstTimestamp:2026-03-18 16:44:17.47218276 +0000 UTC m=+0.491843018,LastTimestamp:2026-03-18 16:44:17.47218276 +0000 UTC m=+0.491843018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-219.ec2.internal,}"
Mar 18 16:44:17.483846 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.483811 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:17.484474 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.484458 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 16:44:17.485179 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485158 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g8w29"
Mar 18 16:44:17.485264 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485241 2571 factory.go:153] Registering CRI-O factory
Mar 18 16:44:17.485264 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485255 2571 factory.go:223] Registration of the crio container factory successfully
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485320 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485329 2571 factory.go:55] Registering systemd factory
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485338 2571 factory.go:223] Registration of the systemd container factory successfully
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485341 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485359 2571 factory.go:103] Registering Raw factory
Mar 18 16:44:17.485367 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485341 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 18 16:44:17.485589 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485379 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 18 16:44:17.485589 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485385 2571 manager.go:1196] Started watching for new ooms in manager
Mar 18 16:44:17.485589 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485510 2571 reconstruct.go:97] "Volume reconstruction finished"
Mar 18 16:44:17.485589 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.485526 2571 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 16:44:17.485998 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.485968 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.486073 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.486050 2571 manager.go:319] Starting recovery of all containers
Mar 18 16:44:17.486174 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.486144 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 18 16:44:17.486297 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.486247 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-219.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Mar 18 16:44:17.493348 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.493168 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g8w29"
Mar 18 16:44:17.496137 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.496085 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 18 16:44:17.496834 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.496799 2571 manager.go:324] Recovery completed
Mar 18 16:44:17.501204 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.501187 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.503643 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.503627 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.503694 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.503660 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.503694 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.503675 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.504211 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.504195 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 18 16:44:17.504211 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.504206 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 18 16:44:17.504342 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.504222 2571 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:17.506385 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.506373 2571 policy_none.go:49] "None policy: Start"
Mar 18 16:44:17.506422 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.506389 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 18 16:44:17.506422 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.506400 2571 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 16:44:17.544684 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.544658 2571 manager.go:341] "Starting Device Plugin manager"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.544712 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.544729 2571 server.go:85] "Starting device plugin registration server"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.545005 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.545017 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.545575 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.545676 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.545685 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.545850 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 18 16:44:17.570464 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.545890 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.630120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.630044 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 18 16:44:17.630120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.630079 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 18 16:44:17.630120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.630098 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 18 16:44:17.630120 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.630105 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 18 16:44:17.630335 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.630138 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 18 16:44:17.632949 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.632921 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:17.646001 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.645971 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.646963 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.646944 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.647043 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.646975 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.647043 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.646987 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.647043 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.647009 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.658630 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.658606 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.658736 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.658633 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-219.ec2.internal\": node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.675651 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.675625 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.730980 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.730930 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"]
Mar 18 16:44:17.731144 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.731051 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.732597 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.732579 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.732711 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.732617 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.732711 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.732638 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.735019 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735007 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.735185 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735172 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.735223 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735200 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.735751 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735734 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.735751 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735745 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.735907 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735767 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.735907 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735768 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.735907 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735783 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.735907 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.735781 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.738000 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.737986 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.738070 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.738010 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:17.738781 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.738766 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:17.738882 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.738799 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:17.738882 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.738811 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:17.770582 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.770557 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-219.ec2.internal\" not found" node="ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.775242 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.775228 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-219.ec2.internal\" not found" node="ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.775993 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.775981 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.787180 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.787153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e24708ab1b7fb0b335c2f977303c6cf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-219.ec2.internal\" (UID: \"4e24708ab1b7fb0b335c2f977303c6cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.787288 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.787206 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.787288 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.787253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.876833 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.876794 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:17.888294 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.888294 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.888294 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e24708ab1b7fb0b335c2f977303c6cf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-219.ec2.internal\" (UID: \"4e24708ab1b7fb0b335c2f977303c6cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.888449 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e24708ab1b7fb0b335c2f977303c6cf-config\") pod \"kube-apiserver-proxy-ip-10-0-137-219.ec2.internal\" (UID: \"4e24708ab1b7fb0b335c2f977303c6cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.888449 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.888449 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:17.888341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7627a05b348819fc2dc1ab57705bc7c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal\" (UID: \"7627a05b348819fc2dc1ab57705bc7c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:17.977636 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:17.977592 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.073188 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.073151 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:18.076738 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.076720 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:18.077803 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.077783 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.178509 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.178420 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.278986 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.278942 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.377664 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.377640 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:44:18.378156 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.377781 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:18.379813 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.379784 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.480862 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.480780 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.484605 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.484588 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:18.495740 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.495702 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:17 +0000 UTC" deadline="2027-12-24 02:36:18.122925818 +0000 UTC"
Mar 18 16:44:18.495740 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.495733 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15489h51m59.627196909s"
Mar 18 16:44:18.500894 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.500865 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:18.524556 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.524523 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2jvpd"
Mar 18 16:44:18.530379 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.530345 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2jvpd"
Mar 18 16:44:18.581553 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:18.581513 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-219.ec2.internal\" not found"
Mar 18 16:44:18.636944 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:18.636910 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e24708ab1b7fb0b335c2f977303c6cf.slice/crio-286a13a315f05fec48d62d51ab436a6137bbc7998ccb2842b1f78c204c042fc8 WatchSource:0}: Error finding container 286a13a315f05fec48d62d51ab436a6137bbc7998ccb2842b1f78c204c042fc8: Status 404 returned error can't find the container with id 286a13a315f05fec48d62d51ab436a6137bbc7998ccb2842b1f78c204c042fc8
Mar 18 16:44:18.637181 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:18.637156 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7627a05b348819fc2dc1ab57705bc7c5.slice/crio-6949333fe8ef488ac87de5cdcce0335999fbbec485b685ab7104e8eb42fd4b36 WatchSource:0}: Error finding container 6949333fe8ef488ac87de5cdcce0335999fbbec485b685ab7104e8eb42fd4b36: Status 404 returned error can't find the container with id 6949333fe8ef488ac87de5cdcce0335999fbbec485b685ab7104e8eb42fd4b36
Mar 18 16:44:18.640276 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.640259 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:44:18.642731 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.642709 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:18.685286 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.685257 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal"
Mar 18 16:44:18.694609 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.694581 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:18.696197 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.696178 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal" Mar 18 16:44:18.709792 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.709766 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:18.790773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.790690 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:18.839849 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:18.839805 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:19.459256 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.459223 2571 apiserver.go:52] "Watching apiserver" Mar 18 16:44:19.465292 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.465264 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:19.467399 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.467370 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-4vbk6","kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv","openshift-dns/node-resolver-zpjht","openshift-image-registry/node-ca-sv4zc","openshift-multus/network-metrics-daemon-mqv2m","openshift-network-operator/iptables-alerter-tl89r","openshift-ovn-kubernetes/ovnkube-node-wxnmv","openshift-cluster-node-tuning-operator/tuned-xss9q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal","openshift-multus/multus-additional-cni-plugins-5rc5k","openshift-multus/multus-xn75z","openshift-network-diagnostics/network-check-target-n98cl"] Mar 18 16:44:19.470770 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.470743 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.472763 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.472738 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.473038 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.473019 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.473242 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.473225 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:44:19.473464 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.473444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vgjtp\"" Mar 18 16:44:19.475618 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.475136 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.477645 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.477419 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.477754 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.477691 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.478430 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.478407 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bkbxc\"" Mar 18 16:44:19.478619 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.478602 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.479726 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.479705 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.480122 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.480101 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kszpw\"" Mar 18 16:44:19.480298 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.480268 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:44:19.480737 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.480554 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.480872 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.480747 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:19.480872 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.480814 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da" Mar 18 16:44:19.483406 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.483327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.483503 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.483429 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.485192 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.485121 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.485366 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.485348 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:44:19.486508 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.486488 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:44:19.486708 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.486693 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wld92\"" Mar 18 16:44:19.486757 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.486729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.488231 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.486749 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2vbv9\"" Mar 18 16:44:19.489745 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.489018 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:44:19.490848 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.490257 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.493135 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.493352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hhpw6\"" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.493560 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.493775 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.494142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.494242 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 
18 16:44:19.494410 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.494252 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.496567 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.496546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:19.496669 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.496648 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85" Mar 18 16:44:19.496803 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.496784 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.496975 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.496957 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.497226 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6ec8ca6-7754-4403-be16-a059d6c3184b-hosts-file\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.497304 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c30f7041-e680-4daa-8027-9cc3928f1d8e-konnectivity-ca\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.497304 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-var-lib-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.497401 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497316 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.497401 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-netd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.497401 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.497548 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-sys-fs\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.497447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6ec8ca6-7754-4403-be16-a059d6c3184b-tmp-dir\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498105 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzvd\" (UniqueName: \"kubernetes.io/projected/a6ec8ca6-7754-4403-be16-a059d6c3184b-kube-api-access-qqzvd\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498134 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-etc-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-node-log\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-log-socket\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-bin\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-script-lib\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-env-overrides\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovn-node-metrics-cert\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.498429 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ea30226-c756-46d3-b584-3613c9c03311-host\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5dp\" (UniqueName: \"kubernetes.io/projected/8ea30226-c756-46d3-b584-3613c9c03311-kube-api-access-zd5dp\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrnp\" (UniqueName: \"kubernetes.io/projected/268f643d-a996-4e11-b14c-70264a1426da-kube-api-access-ccrnp\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-systemd-units\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kg7\" (UniqueName: \"kubernetes.io/projected/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-kube-api-access-f5kg7\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 
kubenswrapper[2571]: I0318 16:44:19.498604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-registration-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-kubelet\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-ovn\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-socket-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498693 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qcd\" (UniqueName: \"kubernetes.io/projected/bc25d175-d716-4c46-95bc-1e8f5c0f8681-kube-api-access-d5qcd\") pod 
\"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jvv\" (UniqueName: \"kubernetes.io/projected/3cac61f7-f970-490f-8068-ac685bccdb7e-kube-api-access-26jvv\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498759 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc25d175-d716-4c46-95bc-1e8f5c0f8681-host-slash\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-netns\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498863 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-systemd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc25d175-d716-4c46-95bc-1e8f5c0f8681-iptables-alerter-script\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.499945 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498932 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c30f7041-e680-4daa-8027-9cc3928f1d8e-agent-certs\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.499945 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-slash\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499945 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.498976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-config\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.499945 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.499004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-device-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.499945 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.499041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ea30226-c756-46d3-b584-3613c9c03311-serviceca\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.500280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500041 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.500280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500074 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:44:19.500280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500251 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:44:19.500523 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500489 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nv6rb\"" Mar 18 16:44:19.500576 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500528 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:44:19.500576 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500478 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:44:19.500908 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.501012 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.500946 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.502029 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.501093 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t8cpw\"" Mar 18 16:44:19.503898 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.502685 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:19.503898 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.503050 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jlxmp\"" Mar 18 16:44:19.503898 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.503221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:19.531092 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.531055 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:18 +0000 UTC" deadline="2027-08-31 01:58:26.312435519 +0000 UTC" Mar 18 16:44:19.531092 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.531090 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12729h14m6.781349945s" Mar 18 16:44:19.587244 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.587213 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:19.599625 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-multus\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.599809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599648 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6ec8ca6-7754-4403-be16-a059d6c3184b-hosts-file\") pod \"node-resolver-zpjht\" (UID: 
\"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.599809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c30f7041-e680-4daa-8027-9cc3928f1d8e-konnectivity-ca\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.599809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-var-lib-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.599809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.599809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-netd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-sys-fs\") pod 
\"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzvd\" (UniqueName: \"kubernetes.io/projected/a6ec8ca6-7754-4403-be16-a059d6c3184b-kube-api-access-qqzvd\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6ec8ca6-7754-4403-be16-a059d6c3184b-hosts-file\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599897 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-node-log\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-script-lib\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-netd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.599974 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-os-release\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-var-lib-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600090 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600101 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zd5dp\" (UniqueName: \"kubernetes.io/projected/8ea30226-c756-46d3-b584-3613c9c03311-kube-api-access-zd5dp\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-cnibin\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-tmp\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " 
pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjsxj\" (UniqueName: \"kubernetes.io/projected/c15c5fb8-0b4e-452f-8ef8-120891ac0340-kube-api-access-cjsxj\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-ovn\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-socket-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-etc-selinux\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-conf\") 
pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkv88\" (UniqueName: \"kubernetes.io/projected/99e563f9-8ea2-45d6-99f0-5957e8b06150-kube-api-access-wkv88\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600460 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-node-log\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c30f7041-e680-4daa-8027-9cc3928f1d8e-konnectivity-ca\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600513 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:19.600532 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-sys-fs\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-netns\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-ovn\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc25d175-d716-4c46-95bc-1e8f5c0f8681-host-slash\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.600663 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-netns\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-run-netns\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.600754 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:20.100719117 +0000 UTC m=+3.120379371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-socket-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.600666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc25d175-d716-4c46-95bc-1e8f5c0f8681-host-slash\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-sys\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601104 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cni-binary-copy\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc25d175-d716-4c46-95bc-1e8f5c0f8681-iptables-alerter-script\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.601258 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-device-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.602059 
ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-modprobe-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysconfig\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601265 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-os-release\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601293 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-bin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-daemon-config\") pod 
\"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601349 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-multus-certs\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6ec8ca6-7754-4403-be16-a059d6c3184b-tmp-dir\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-etc-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-log-socket\") pod 
\"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-bin\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrnp\" (UniqueName: \"kubernetes.io/projected/268f643d-a996-4e11-b14c-70264a1426da-kube-api-access-ccrnp\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-systemd\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-etc-kubernetes\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-etc-openvswitch\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-script-lib\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601762 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-cni-bin\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc25d175-d716-4c46-95bc-1e8f5c0f8681-iptables-alerter-script\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-device-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-env-overrides\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-log-socket\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovn-node-metrics-cert\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.601986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6ec8ca6-7754-4403-be16-a059d6c3184b-tmp-dir\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ea30226-c756-46d3-b584-3613c9c03311-host\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8ea30226-c756-46d3-b584-3613c9c03311-host\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-system-cni-dir\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-kubelet\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602154 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-conf-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602226 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-systemd-units\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-systemd-units\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602396 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-env-overrides\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kg7\" (UniqueName: \"kubernetes.io/projected/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-kube-api-access-f5kg7\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.602773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-registration-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvg86\" (UniqueName: \"kubernetes.io/projected/943aaf2c-0467-48a6-ac3c-29da845f8444-kube-api-access-jvg86\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cac61f7-f970-490f-8068-ac685bccdb7e-registration-dir\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-hostroot\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-kubelet\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-kubernetes\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-run\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602715 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-kubelet\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-system-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qcd\" (UniqueName: \"kubernetes.io/projected/bc25d175-d716-4c46-95bc-1e8f5c0f8681-kube-api-access-d5qcd\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26jvv\" (UniqueName: \"kubernetes.io/projected/3cac61f7-f970-490f-8068-ac685bccdb7e-kube-api-access-26jvv\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-host\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-tuned\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-socket-dir-parent\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-systemd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.603575 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.602997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-lib-modules\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603032 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cnibin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-k8s-cni-cncf-io\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c30f7041-e680-4daa-8027-9cc3928f1d8e-agent-certs\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-slash\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-config\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ea30226-c756-46d3-b584-3613c9c03311-serviceca\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-var-lib-kubelet\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-run-systemd\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603601 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-host-slash\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.604329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.603960 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovnkube-config\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.606383 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.605936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ea30226-c756-46d3-b584-3613c9c03311-serviceca\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc"
Mar 18 16:44:19.609196 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.606515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-ovn-node-metrics-cert\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.609196 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.606792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c30f7041-e680-4daa-8027-9cc3928f1d8e-agent-certs\") pod \"konnectivity-agent-4vbk6\" (UID: \"c30f7041-e680-4daa-8027-9cc3928f1d8e\") " pod="kube-system/konnectivity-agent-4vbk6"
Mar 18 16:44:19.613374 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.613283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qcd\" (UniqueName: \"kubernetes.io/projected/bc25d175-d716-4c46-95bc-1e8f5c0f8681-kube-api-access-d5qcd\") pod \"iptables-alerter-tl89r\" (UID: \"bc25d175-d716-4c46-95bc-1e8f5c0f8681\") " pod="openshift-network-operator/iptables-alerter-tl89r"
Mar 18 16:44:19.613374 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.613305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrnp\" (UniqueName: \"kubernetes.io/projected/268f643d-a996-4e11-b14c-70264a1426da-kube-api-access-ccrnp\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:19.614017 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.613993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kg7\" (UniqueName: \"kubernetes.io/projected/d16fe930-746d-4ac7-b7e6-09d6a8b75d28-kube-api-access-f5kg7\") pod \"ovnkube-node-wxnmv\" (UID: \"d16fe930-746d-4ac7-b7e6-09d6a8b75d28\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:19.614193 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.614165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzvd\" (UniqueName: \"kubernetes.io/projected/a6ec8ca6-7754-4403-be16-a059d6c3184b-kube-api-access-qqzvd\") pod \"node-resolver-zpjht\" (UID: \"a6ec8ca6-7754-4403-be16-a059d6c3184b\") " pod="openshift-dns/node-resolver-zpjht"
Mar 18 16:44:19.614912 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.614891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd5dp\" (UniqueName: \"kubernetes.io/projected/8ea30226-c756-46d3-b584-3613c9c03311-kube-api-access-zd5dp\") pod \"node-ca-sv4zc\" (UID: \"8ea30226-c756-46d3-b584-3613c9c03311\") " pod="openshift-image-registry/node-ca-sv4zc"
Mar 18 16:44:19.615950 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.615930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jvv\" (UniqueName: \"kubernetes.io/projected/3cac61f7-f970-490f-8068-ac685bccdb7e-kube-api-access-26jvv\") pod \"aws-ebs-csi-driver-node-msqzv\" (UID: \"3cac61f7-f970-490f-8068-ac685bccdb7e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv"
Mar 18 16:44:19.634077 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.634015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal" event={"ID":"4e24708ab1b7fb0b335c2f977303c6cf","Type":"ContainerStarted","Data":"286a13a315f05fec48d62d51ab436a6137bbc7998ccb2842b1f78c204c042fc8"}
Mar 18 16:44:19.635099 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.635066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal" event={"ID":"7627a05b348819fc2dc1ab57705bc7c5","Type":"ContainerStarted","Data":"6949333fe8ef488ac87de5cdcce0335999fbbec485b685ab7104e8eb42fd4b36"}
Mar 18 16:44:19.694928 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.694893 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:19.703620 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-os-release\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-cnibin\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-tmp\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-os-release\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703710 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjsxj\" (UniqueName: \"kubernetes.io/projected/c15c5fb8-0b4e-452f-8ef8-120891ac0340-kube-api-access-cjsxj\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-cnibin\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-conf\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkv88\" (UniqueName: \"kubernetes.io/projected/99e563f9-8ea2-45d6-99f0-5957e8b06150-kube-api-access-wkv88\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.703812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-netns\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-sys\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cni-binary-copy\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-modprobe-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703961 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysconfig\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.703983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-os-release\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-bin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-daemon-config\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-multus-certs\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-systemd\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704101 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-etc-kubernetes\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-bin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-system-cni-dir\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-kubelet\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-conf-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvg86\" (UniqueName: \"kubernetes.io/projected/943aaf2c-0467-48a6-ac3c-29da845f8444-kube-api-access-jvg86\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.704280 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-hostroot\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704300 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-kubernetes\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-run\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-system-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-host\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-tuned\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-multus-certs\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-socket-dir-parent\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704483 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-lib-modules\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cnibin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-k8s-cni-cncf-io\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-lib-modules\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705168 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-var-lib-kubelet\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-etc-kubernetes\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z"
Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k"
Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704656 2571
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-daemon-config\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-conf\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysconfig\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-modprobe-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-os-release\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704856 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-kubernetes\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-sys\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-system-cni-dir\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-systemd\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cni-binary-copy\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704929 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-kubelet\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-run\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-multus\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-cnibin\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.704994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-socket-dir-parent\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.705883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-conf-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-system-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-var-lib-cni-multus\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705082 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-var-lib-kubelet\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-sysctl-d\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-multus-cni-dir\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-netns\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e563f9-8ea2-45d6-99f0-5957e8b06150-host\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-host-run-k8s-cni-cncf-io\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c15c5fb8-0b4e-452f-8ef8-120891ac0340-hostroot\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/943aaf2c-0467-48a6-ac3c-29da845f8444-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.706516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.705682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/943aaf2c-0467-48a6-ac3c-29da845f8444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.707039 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.706521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-tmp\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.707039 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.706850 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99e563f9-8ea2-45d6-99f0-5957e8b06150-etc-tuned\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.712632 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.712569 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:19.712632 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.712590 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:19.712632 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.712603 2571 projected.go:194] Error preparing data for projected 
volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:19.712887 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:19.712669 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:20.212653144 +0000 UTC m=+3.232313393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:19.715005 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.714975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkv88\" (UniqueName: \"kubernetes.io/projected/99e563f9-8ea2-45d6-99f0-5957e8b06150-kube-api-access-wkv88\") pod \"tuned-xss9q\" (UID: \"99e563f9-8ea2-45d6-99f0-5957e8b06150\") " pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:19.715270 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.715253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvg86\" (UniqueName: \"kubernetes.io/projected/943aaf2c-0467-48a6-ac3c-29da845f8444-kube-api-access-jvg86\") pod \"multus-additional-cni-plugins-5rc5k\" (UID: \"943aaf2c-0467-48a6-ac3c-29da845f8444\") " pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.715270 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.715264 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjsxj\" (UniqueName: \"kubernetes.io/projected/c15c5fb8-0b4e-452f-8ef8-120891ac0340-kube-api-access-cjsxj\") pod \"multus-xn75z\" (UID: \"c15c5fb8-0b4e-452f-8ef8-120891ac0340\") " pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.796839 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.796789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tl89r" Mar 18 16:44:19.808572 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.808540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zpjht" Mar 18 16:44:19.817290 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.817264 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" Mar 18 16:44:19.823955 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.823917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sv4zc" Mar 18 16:44:19.830656 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.830633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:19.839416 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.839391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:19.847037 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.847013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" Mar 18 16:44:19.854690 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.854666 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xn75z" Mar 18 16:44:19.861362 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:19.861334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xss9q" Mar 18 16:44:20.107869 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.107780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:20.107997 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.107919 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:20.107997 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.107980 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:21.10796443 +0000 UTC m=+4.127624678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:20.308880 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.308843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:20.309034 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.309013 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:20.309070 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.309040 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:20.309070 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.309050 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:20.309149 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:20.309105 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:21.30909089 +0000 UTC m=+4.328751135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:20.327009 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.326973 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943aaf2c_0467_48a6_ac3c_29da845f8444.slice/crio-558f89104131035280694305a8e49fc0350f5f0bb4f2c79adf340e8270c7d676 WatchSource:0}: Error finding container 558f89104131035280694305a8e49fc0350f5f0bb4f2c79adf340e8270c7d676: Status 404 returned error can't find the container with id 558f89104131035280694305a8e49fc0350f5f0bb4f2c79adf340e8270c7d676 Mar 18 16:44:20.328765 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.328744 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e563f9_8ea2_45d6_99f0_5957e8b06150.slice/crio-d40313e9a7dc53c6c0728d00ed88084ad6409eb94c0fbd9eebf04b964288859d WatchSource:0}: Error finding container d40313e9a7dc53c6c0728d00ed88084ad6409eb94c0fbd9eebf04b964288859d: Status 404 returned error can't find the container with id d40313e9a7dc53c6c0728d00ed88084ad6409eb94c0fbd9eebf04b964288859d Mar 18 16:44:20.331583 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.331561 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc15c5fb8_0b4e_452f_8ef8_120891ac0340.slice/crio-36025d13f528b472b86922b52092d8bf7e58344f6bb11107ebc0ab9546d83958 WatchSource:0}: Error finding container 
36025d13f528b472b86922b52092d8bf7e58344f6bb11107ebc0ab9546d83958: Status 404 returned error can't find the container with id 36025d13f528b472b86922b52092d8bf7e58344f6bb11107ebc0ab9546d83958 Mar 18 16:44:20.332485 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.332463 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc25d175_d716_4c46_95bc_1e8f5c0f8681.slice/crio-3042369cd66da68094a6600b9df621a54ad079c3b47b9df2e16f12fb631e81b8 WatchSource:0}: Error finding container 3042369cd66da68094a6600b9df621a54ad079c3b47b9df2e16f12fb631e81b8: Status 404 returned error can't find the container with id 3042369cd66da68094a6600b9df621a54ad079c3b47b9df2e16f12fb631e81b8 Mar 18 16:44:20.353924 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.353893 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16fe930_746d_4ac7_b7e6_09d6a8b75d28.slice/crio-e435573f115bceb7291d749e58f14045b365562cc73c54de004b647e914fb0a4 WatchSource:0}: Error finding container e435573f115bceb7291d749e58f14045b365562cc73c54de004b647e914fb0a4: Status 404 returned error can't find the container with id e435573f115bceb7291d749e58f14045b365562cc73c54de004b647e914fb0a4 Mar 18 16:44:20.354544 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.354518 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30f7041_e680_4daa_8027_9cc3928f1d8e.slice/crio-e1740ab5c290c2eb0d9486f044c1f420c648a2cf89cd6be5178c75d6abf91d37 WatchSource:0}: Error finding container e1740ab5c290c2eb0d9486f044c1f420c648a2cf89cd6be5178c75d6abf91d37: Status 404 returned error can't find the container with id e1740ab5c290c2eb0d9486f044c1f420c648a2cf89cd6be5178c75d6abf91d37 Mar 18 16:44:20.355182 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.355132 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea30226_c756_46d3_b584_3613c9c03311.slice/crio-3e68273129196f4e8827a4c7e1cd96bd5162892ae5e012fd571223c5c78f3b54 WatchSource:0}: Error finding container 3e68273129196f4e8827a4c7e1cd96bd5162892ae5e012fd571223c5c78f3b54: Status 404 returned error can't find the container with id 3e68273129196f4e8827a4c7e1cd96bd5162892ae5e012fd571223c5c78f3b54 Mar 18 16:44:20.356627 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.356516 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ec8ca6_7754_4403_be16_a059d6c3184b.slice/crio-2d2712fae62d6a56be8cf038534f5bf5c2bff01c351a0e31ee3a857ff887c914 WatchSource:0}: Error finding container 2d2712fae62d6a56be8cf038534f5bf5c2bff01c351a0e31ee3a857ff887c914: Status 404 returned error can't find the container with id 2d2712fae62d6a56be8cf038534f5bf5c2bff01c351a0e31ee3a857ff887c914 Mar 18 16:44:20.357619 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:20.357598 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cac61f7_f970_490f_8068_ac685bccdb7e.slice/crio-3902950a812e1afb59c2d57aa45102dca8cd94ad83a51299fa80dcc5b553d4c0 WatchSource:0}: Error finding container 3902950a812e1afb59c2d57aa45102dca8cd94ad83a51299fa80dcc5b553d4c0: Status 404 returned error can't find the container with id 3902950a812e1afb59c2d57aa45102dca8cd94ad83a51299fa80dcc5b553d4c0 Mar 18 16:44:20.532089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.531903 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:18 +0000 UTC" deadline="2027-12-24 23:33:47.514981046 +0000 UTC" Mar 18 16:44:20.532089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.532080 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15510h49m26.982905747s"
Mar 18 16:44:20.637912 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.637778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tl89r" event={"ID":"bc25d175-d716-4c46-95bc-1e8f5c0f8681","Type":"ContainerStarted","Data":"3042369cd66da68094a6600b9df621a54ad079c3b47b9df2e16f12fb631e81b8"}
Mar 18 16:44:20.639343 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.639322 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xn75z" event={"ID":"c15c5fb8-0b4e-452f-8ef8-120891ac0340","Type":"ContainerStarted","Data":"36025d13f528b472b86922b52092d8bf7e58344f6bb11107ebc0ab9546d83958"}
Mar 18 16:44:20.640867 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.640835 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal" event={"ID":"4e24708ab1b7fb0b335c2f977303c6cf","Type":"ContainerStarted","Data":"16271b7967d9c2df44da1abaa41accda6f2f01ec251640d8bfbdac27ef15bc5b"}
Mar 18 16:44:20.641997 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.641973 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" event={"ID":"3cac61f7-f970-490f-8068-ac685bccdb7e","Type":"ContainerStarted","Data":"3902950a812e1afb59c2d57aa45102dca8cd94ad83a51299fa80dcc5b553d4c0"}
Mar 18 16:44:20.643107 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.643082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4vbk6" event={"ID":"c30f7041-e680-4daa-8027-9cc3928f1d8e","Type":"ContainerStarted","Data":"e1740ab5c290c2eb0d9486f044c1f420c648a2cf89cd6be5178c75d6abf91d37"}
Mar 18 16:44:20.644309 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.644287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"e435573f115bceb7291d749e58f14045b365562cc73c54de004b647e914fb0a4"}
Mar 18 16:44:20.645392 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.645368 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xss9q" event={"ID":"99e563f9-8ea2-45d6-99f0-5957e8b06150","Type":"ContainerStarted","Data":"d40313e9a7dc53c6c0728d00ed88084ad6409eb94c0fbd9eebf04b964288859d"}
Mar 18 16:44:20.646344 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.646322 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerStarted","Data":"558f89104131035280694305a8e49fc0350f5f0bb4f2c79adf340e8270c7d676"}
Mar 18 16:44:20.647392 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.647371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zpjht" event={"ID":"a6ec8ca6-7754-4403-be16-a059d6c3184b","Type":"ContainerStarted","Data":"2d2712fae62d6a56be8cf038534f5bf5c2bff01c351a0e31ee3a857ff887c914"}
Mar 18 16:44:20.648562 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:20.648540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sv4zc" event={"ID":"8ea30226-c756-46d3-b584-3613c9c03311","Type":"ContainerStarted","Data":"3e68273129196f4e8827a4c7e1cd96bd5162892ae5e012fd571223c5c78f3b54"}
Mar 18 16:44:21.116362 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.116006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:21.116362 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.116208 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:21.117056 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.117034 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.116254433 +0000 UTC m=+6.135914683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:21.318347 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.318309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:21.318547 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.318527 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:21.318623 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.318552 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:21.318623 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.318566 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:21.318718 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.318625 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.318606637 +0000 UTC m=+6.338266889 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:21.631268 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.631177 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:21.631711 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.631313 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:21.631770 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.631743 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:21.631884 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:21.631862 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:21.655205 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.655169 2571 generic.go:358] "Generic (PLEG): container finished" podID="7627a05b348819fc2dc1ab57705bc7c5" containerID="b990329ccc49d77d5ae215a8eae9bf0b620e050a70f8cf8a57c1809ca7e7cdcd" exitCode=0
Mar 18 16:44:21.656151 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.656127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal" event={"ID":"7627a05b348819fc2dc1ab57705bc7c5","Type":"ContainerDied","Data":"b990329ccc49d77d5ae215a8eae9bf0b620e050a70f8cf8a57c1809ca7e7cdcd"}
Mar 18 16:44:21.672323 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:21.672215 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-219.ec2.internal" podStartSLOduration=3.67219667 podStartE2EDuration="3.67219667s" podCreationTimestamp="2026-03-18 16:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:20.653764886 +0000 UTC m=+3.673425154" watchObservedRunningTime="2026-03-18 16:44:21.67219667 +0000 UTC m=+4.691856952"
Mar 18 16:44:22.665033 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:22.664995 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal" event={"ID":"7627a05b348819fc2dc1ab57705bc7c5","Type":"ContainerStarted","Data":"ba1a6756dcf5d39860ecf3e283d415b906cac00c94ffb621cd2ef05717813de4"}
Mar 18 16:44:23.134661 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:23.134015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:23.134661 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.134184 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:23.134661 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.134250 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:27.134230143 +0000 UTC m=+10.153890393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:23.336359 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:23.336307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:23.336604 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.336520 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:23.336604 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.336542 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:23.336604 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.336555 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:23.336769 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.336640 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:27.336620519 +0000 UTC m=+10.356280770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:23.632550 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:23.631950 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:23.632550 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.632098 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:23.632986 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:23.632845 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:23.632986 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:23.632947 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:25.634303 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:25.634274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:25.634754 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:25.634394 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:25.634754 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:25.634484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:25.634754 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:25.634561 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:27.168353 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:27.168304 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:27.168854 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.168508 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:27.168854 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.168588 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:35.168568058 +0000 UTC m=+18.188228310 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:27.370685 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:27.370636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:27.370929 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.370816 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:27.370929 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.370850 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:27.370929 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.370862 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:27.370929 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.370920 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:35.370902655 +0000 UTC m=+18.390562906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:27.632300 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:27.631544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:27.632300 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.631673 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:27.632300 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:27.632116 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:27.632300 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:27.632210 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:29.630986 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:29.630634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:29.630986 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:29.630643 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:29.630986 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:29.630759 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:29.630986 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:29.630799 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:31.631185 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:31.631153 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:31.631615 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:31.631196 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:31.631615 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:31.631266 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:31.631615 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:31.631398 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:33.631037 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:33.631000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:33.631505 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:33.631041 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:33.631505 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:33.631144 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:33.631505 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:33.631271 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:35.221936 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:35.221866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:35.222386 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.222048 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:35.222386 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.222112 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.222097439 +0000 UTC m=+34.241757683 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:35.423929 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:35.423892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:35.424127 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.424046 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:35.424127 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.424064 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:35.424127 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.424074 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:35.424250 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.424135 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.42412182 +0000 UTC m=+34.443782069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:35.630615 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:35.630531 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:35.630615 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:35.630581 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:35.630856 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.630679 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:35.630856 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:35.630839 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:36.708448 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.708384 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-219.ec2.internal" podStartSLOduration=18.708363161 podStartE2EDuration="18.708363161s" podCreationTimestamp="2026-03-18 16:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:22.678236377 +0000 UTC m=+5.697896645" watchObservedRunningTime="2026-03-18 16:44:36.708363161 +0000 UTC m=+19.728023430"
Mar 18 16:44:36.708937 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.708729 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-l26p8"]
Mar 18 16:44:36.784163 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.784125 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.784334 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:36.784213 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a"
Mar 18 16:44:36.832874 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.832839 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-dbus\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.832874 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.832875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.833042 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.832982 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-kubelet-config\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933411 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.933375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-kubelet-config\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933596 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.933440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-dbus\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933596 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.933467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933596 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.933494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-kubelet-config\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933756 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:36.933600 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:36.933756 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:36.933652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/263447b9-97dc-43ab-bd1c-1fc05a93e62a-dbus\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:36.933756 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:36.933668 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret podName:263447b9-97dc-43ab-bd1c-1fc05a93e62a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.433648565 +0000 UTC m=+20.453308824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret") pod "global-pull-secret-syncer-l26p8" (UID: "263447b9-97dc-43ab-bd1c-1fc05a93e62a") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:37.438849 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.438789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:37.438973 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:37.438955 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:37.439031 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:37.439019 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret podName:263447b9-97dc-43ab-bd1c-1fc05a93e62a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:38.439002058 +0000 UTC m=+21.458662320 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret") pod "global-pull-secret-syncer-l26p8" (UID: "263447b9-97dc-43ab-bd1c-1fc05a93e62a") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:37.634397 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.634173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:37.634496 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.634408 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:37.634534 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:37.634514 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:37.634638 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:37.634621 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:37.692489 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.692445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xn75z" event={"ID":"c15c5fb8-0b4e-452f-8ef8-120891ac0340","Type":"ContainerStarted","Data":"87caf339d22de244a1155b4187d8866ec0e52211aecad633112c52ca1a9ecc47"}
Mar 18 16:44:37.694260 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.694223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" event={"ID":"3cac61f7-f970-490f-8068-ac685bccdb7e","Type":"ContainerStarted","Data":"9d93b8c04d28defb5fd2266119599c12014f350080bd3cc9e469cfc664f23fe0"}
Mar 18 16:44:37.696108 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.696084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4vbk6" event={"ID":"c30f7041-e680-4daa-8027-9cc3928f1d8e","Type":"ContainerStarted","Data":"e896bc5072f5aed6a3aa5c261d5e6e1ed04365182226adb6784c97627461524e"}
Mar 18 16:44:37.697629 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.697601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"bcb971bb8dc07518fd4f8310b9e1b5e69d707bc8da804331985de126e9adaee7"}
Mar 18 16:44:37.697718 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.697638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"4e2dbf929c1e7bd0d0d6c65f8fe62d0a9b89b6ae79f90b0b53d64957c89bb090"}
Mar 18 16:44:37.698994 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.698922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xss9q"
event={"ID":"99e563f9-8ea2-45d6-99f0-5957e8b06150","Type":"ContainerStarted","Data":"d89c5c3b4b63741797b884c26c9ae473169ed12fbd5653aa86751c02cab6ef3e"} Mar 18 16:44:37.700516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.700489 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="20b767f995e5f7150f2da0499ea061df3c73993c0d7dc0479784116c3bd283b3" exitCode=0 Mar 18 16:44:37.700583 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.700556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"20b767f995e5f7150f2da0499ea061df3c73993c0d7dc0479784116c3bd283b3"} Mar 18 16:44:37.705718 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.705659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zpjht" event={"ID":"a6ec8ca6-7754-4403-be16-a059d6c3184b","Type":"ContainerStarted","Data":"3f7ae20dc228a9b64c6763fd076a28eefa75ddd1e3559c2b5f8b4af1ced83ea0"} Mar 18 16:44:37.721409 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.721378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sv4zc" event={"ID":"8ea30226-c756-46d3-b584-3613c9c03311","Type":"ContainerStarted","Data":"e2a06272f5e4d033c542f6e36873dba8dae8cdf7fccc4e54cf89144fac5eb48f"} Mar 18 16:44:37.735871 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.735812 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xn75z" podStartSLOduration=3.794677617 podStartE2EDuration="20.73579691s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.352257821 +0000 UTC m=+3.371918080" lastFinishedPulling="2026-03-18 16:44:37.293377124 +0000 UTC m=+20.313037373" observedRunningTime="2026-03-18 16:44:37.719726041 +0000 UTC m=+20.739386305" 
watchObservedRunningTime="2026-03-18 16:44:37.73579691 +0000 UTC m=+20.755457176" Mar 18 16:44:37.761198 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.761083 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xss9q" podStartSLOduration=2.804316311 podStartE2EDuration="19.761066114s" podCreationTimestamp="2026-03-18 16:44:18 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.330677014 +0000 UTC m=+3.350337261" lastFinishedPulling="2026-03-18 16:44:37.287426804 +0000 UTC m=+20.307087064" observedRunningTime="2026-03-18 16:44:37.749447477 +0000 UTC m=+20.769107745" watchObservedRunningTime="2026-03-18 16:44:37.761066114 +0000 UTC m=+20.780726382" Mar 18 16:44:37.761516 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.761488 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zpjht" podStartSLOduration=3.832609469 podStartE2EDuration="20.761482273s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.360694111 +0000 UTC m=+3.380354355" lastFinishedPulling="2026-03-18 16:44:37.289566911 +0000 UTC m=+20.309227159" observedRunningTime="2026-03-18 16:44:37.760898734 +0000 UTC m=+20.780559001" watchObservedRunningTime="2026-03-18 16:44:37.761482273 +0000 UTC m=+20.781142540" Mar 18 16:44:37.772484 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.772434 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4vbk6" podStartSLOduration=3.841336724 podStartE2EDuration="20.772419397s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.356304148 +0000 UTC m=+3.375964407" lastFinishedPulling="2026-03-18 16:44:37.287386827 +0000 UTC m=+20.307047080" observedRunningTime="2026-03-18 16:44:37.772322794 +0000 UTC m=+20.791983061" watchObservedRunningTime="2026-03-18 16:44:37.772419397 +0000 UTC m=+20.792079664" 
Mar 18 16:44:37.794364 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:37.794317 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sv4zc" podStartSLOduration=3.947402915 podStartE2EDuration="20.794302415s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.360591021 +0000 UTC m=+3.380251269" lastFinishedPulling="2026-03-18 16:44:37.207490507 +0000 UTC m=+20.227150769" observedRunningTime="2026-03-18 16:44:37.79417766 +0000 UTC m=+20.813837927" watchObservedRunningTime="2026-03-18 16:44:37.794302415 +0000 UTC m=+20.813962682" Mar 18 16:44:38.447092 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.446799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:38.447092 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:38.446946 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.447319 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:38.447170 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret podName:263447b9-97dc-43ab-bd1c-1fc05a93e62a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.447150575 +0000 UTC m=+23.466810822 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret") pod "global-pull-secret-syncer-l26p8" (UID: "263447b9-97dc-43ab-bd1c-1fc05a93e62a") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.630767 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.630736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:38.630950 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:38.630882 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a" Mar 18 16:44:38.726225 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726195 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726527 2571 generic.go:358] "Generic (PLEG): container finished" podID="d16fe930-746d-4ac7-b7e6-09d6a8b75d28" containerID="bcb971bb8dc07518fd4f8310b9e1b5e69d707bc8da804331985de126e9adaee7" exitCode=1 Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerDied","Data":"bcb971bb8dc07518fd4f8310b9e1b5e69d707bc8da804331985de126e9adaee7"} Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726660 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"0a12a01762643b275244b8f6832aec6dfaa431e9094dca36733ed00bbc74ae70"} Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"e285f26d79ff5c797c7cd1575dd0aa127e2c13b17a0a3fc6d95ab7ad0b961e56"} Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"1eeda48ff662882cff49be6e260450d1707d66279c9b48a83108983e15548f47"} Mar 18 16:44:38.726857 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:38.726709 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"1ed3a83208b843b60da254fa9bce56f404fafd6870ef1087b6e709aae7ecc7b5"} Mar 18 16:44:39.002669 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.002638 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:39.556385 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.556261 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:39.002663192Z","UUID":"0d6b24ca-1286-4188-9799-cbdd9ae524d3","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:39.558113 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.558088 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: 
ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:44:39.558243 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.558122 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:39.631006 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.630974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:39.631164 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.631018 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:39.631273 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:39.631228 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85" Mar 18 16:44:39.631273 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:39.631247 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da" Mar 18 16:44:39.730219 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.729950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tl89r" event={"ID":"bc25d175-d716-4c46-95bc-1e8f5c0f8681","Type":"ContainerStarted","Data":"925970bb6469c53c043dcb46ebaca44ca0449a50ef8dec92d72e63eb44b74efd"} Mar 18 16:44:39.732550 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.732512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" event={"ID":"3cac61f7-f970-490f-8068-ac685bccdb7e","Type":"ContainerStarted","Data":"9f4f7c88f5b241e4febfd5ed3dc003ab4d40e79f92ea1130ba45dff0c5e70d7b"} Mar 18 16:44:39.744917 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:39.744877 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tl89r" podStartSLOduration=5.839766402 podStartE2EDuration="22.744864486s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.352303085 +0000 UTC m=+3.371963337" lastFinishedPulling="2026-03-18 16:44:37.25740117 +0000 UTC m=+20.277061421" observedRunningTime="2026-03-18 16:44:39.744563922 +0000 UTC m=+22.764224191" watchObservedRunningTime="2026-03-18 16:44:39.744864486 +0000 UTC m=+22.764524753" Mar 18 16:44:40.461255 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.461220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:40.461466 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:40.461341 2571 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:40.461466 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:40.461412 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret podName:263447b9-97dc-43ab-bd1c-1fc05a93e62a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:44.461398912 +0000 UTC m=+27.481059156 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret") pod "global-pull-secret-syncer-l26p8" (UID: "263447b9-97dc-43ab-bd1c-1fc05a93e62a") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:40.630529 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.630325 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:40.630691 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:40.630551 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a" Mar 18 16:44:40.736724 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.736679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" event={"ID":"3cac61f7-f970-490f-8068-ac685bccdb7e","Type":"ContainerStarted","Data":"9550cfa77fea4e548c402aabddbab5ed2c4f1c9bd81da4d767a3fec194b293d3"} Mar 18 16:44:40.739722 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.739676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:44:40.740074 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.740052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"02e3e180b34110d489d11f8f978d704d0fed1b67f3bbfb472c3a1821ac9e0146"} Mar 18 16:44:40.754432 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:40.754389 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msqzv" podStartSLOduration=4.102184791 podStartE2EDuration="23.754374754s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.360465241 +0000 UTC m=+3.380125486" lastFinishedPulling="2026-03-18 16:44:40.012655194 +0000 UTC m=+23.032315449" observedRunningTime="2026-03-18 16:44:40.754283493 +0000 UTC m=+23.773943759" watchObservedRunningTime="2026-03-18 16:44:40.754374754 +0000 UTC m=+23.774035021" Mar 18 16:44:41.414479 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.414437 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:41.415473 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.415453 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:41.631241 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.631206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:41.631408 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.631206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:41.631408 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:41.631352 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da" Mar 18 16:44:41.631493 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:41.631423 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85" Mar 18 16:44:41.742518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.742446 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:41.742958 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:41.742938 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4vbk6" Mar 18 16:44:42.630982 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.630804 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:42.631113 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:42.631061 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a" Mar 18 16:44:42.746978 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.746951 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:44:42.747856 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.747517 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"978d3c8c46f12b33c453efe3079402152919a3ab5c7fe5718510d85868e8392c"} Mar 18 16:44:42.748235 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.748213 2571 scope.go:117] "RemoveContainer" containerID="bcb971bb8dc07518fd4f8310b9e1b5e69d707bc8da804331985de126e9adaee7" Mar 18 16:44:42.748793 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.748777 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:42.749086 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.749060 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:42.751395 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.751358 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="842c7d758fbaaf1e8fbb99ddd1a6c42409801de2b215f8a4243f982b0938a803" exitCode=0 Mar 18 16:44:42.751513 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.751440 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"842c7d758fbaaf1e8fbb99ddd1a6c42409801de2b215f8a4243f982b0938a803"} Mar 18 16:44:42.765001 ip-10-0-137-219 kubenswrapper[2571]: I0318 
16:44:42.764977 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:42.765696 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:42.765679 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:44:43.630757 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.630718 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:43.630968 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:43.630876 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da" Mar 18 16:44:43.631116 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.631089 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:43.631289 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:43.631250 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85" Mar 18 16:44:43.755928 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.755910 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:44:43.756241 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.756222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" event={"ID":"d16fe930-746d-4ac7-b7e6-09d6a8b75d28","Type":"ContainerStarted","Data":"20f29a8c898f317f3aff43bfb8317c2f6f5c08964a093460a1db7be04e32d59a"} Mar 18 16:44:43.756402 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.756386 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:44:43.757947 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.757926 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="fecbf45c573eb79253b3818511de691404b0858af9bfc7ac9171ae944e7a81c0" exitCode=0 Mar 18 16:44:43.758044 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.757991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"fecbf45c573eb79253b3818511de691404b0858af9bfc7ac9171ae944e7a81c0"} Mar 18 16:44:43.787435 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:43.787395 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" podStartSLOduration=9.809327512 podStartE2EDuration="26.787385022s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.355845582 +0000 UTC m=+3.375505830" lastFinishedPulling="2026-03-18 16:44:37.333903083 +0000 UTC m=+20.353563340" observedRunningTime="2026-03-18 
16:44:43.78650559 +0000 UTC m=+26.806165858" watchObservedRunningTime="2026-03-18 16:44:43.787385022 +0000 UTC m=+26.807045321" Mar 18 16:44:44.370773 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.370737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n98cl"] Mar 18 16:44:44.370981 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.370876 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:44.371032 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:44.370982 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85" Mar 18 16:44:44.372875 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.372842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l26p8"] Mar 18 16:44:44.372989 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.372958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:44.373071 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:44.373048 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a"
Mar 18 16:44:44.375884 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.375852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqv2m"]
Mar 18 16:44:44.375959 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.375952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:44.376055 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:44.376036 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:44.492900 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.492859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:44.493074 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:44.493042 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:44.493151 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:44.493118 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret podName:263447b9-97dc-43ab-bd1c-1fc05a93e62a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:52.493099249 +0000 UTC m=+35.512759521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret") pod "global-pull-secret-syncer-l26p8" (UID: "263447b9-97dc-43ab-bd1c-1fc05a93e62a") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:44.762602 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.762503 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="99a2d588bf7cf8b9945b95b3f113e19d2fa3c8a04d3e6ba52f6d356504825865" exitCode=0
Mar 18 16:44:44.763115 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.762595 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"99a2d588bf7cf8b9945b95b3f113e19d2fa3c8a04d3e6ba52f6d356504825865"}
Mar 18 16:44:44.763115 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:44.762669 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:44:45.630973 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:45.630926 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:45.631152 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:45.631077 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:45.631152 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:45.631112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:45.631263 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:45.631236 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:45.928514 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:45.928420 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv"
Mar 18 16:44:46.630510 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:46.630469 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:46.630669 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:46.630602 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a"
Mar 18 16:44:47.632425 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:47.632232 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:47.632848 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:47.632287 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:47.632848 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:47.632519 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:47.632848 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:47.632565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:48.631166 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:48.631069 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:48.631325 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:48.631204 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l26p8" podUID="263447b9-97dc-43ab-bd1c-1fc05a93e62a"
Mar 18 16:44:49.630864 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:49.630812 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:49.631246 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:49.630843 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:49.631246 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:49.630954 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:44:49.631246 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:49.631001 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n98cl" podUID="5b14876c-8cf1-4df9-a7ba-bc940ec2ca85"
Mar 18 16:44:50.318962 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.318934 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-219.ec2.internal" event="NodeReady"
Mar 18 16:44:50.319105 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.319055 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:44:50.359497 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.359470 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"]
Mar 18 16:44:50.369990 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.369970 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.372191 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.372168 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s9hrx\""
Mar 18 16:44:50.372291 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.372172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 18 16:44:50.372291 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.372285 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Mar 18 16:44:50.372551 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.372533 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Mar 18 16:44:50.375118 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.375098 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-png9k"]
Mar 18 16:44:50.378809 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.378790 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Mar 18 16:44:50.386319 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.386302 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8dnhc"]
Mar 18 16:44:50.386486 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.386468 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.388892 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.388640 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 18 16:44:50.388892 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.388755 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\""
Mar 18 16:44:50.389035 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.388925 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 18 16:44:50.397023 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.397006 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"]
Mar 18 16:44:50.397023 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.397027 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-png9k"]
Mar 18 16:44:50.397119 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.397100 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.398657 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.398638 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8dnhc"]
Mar 18 16:44:50.399102 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.399085 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 18 16:44:50.399201 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.399183 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\""
Mar 18 16:44:50.399269 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.399209 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 18 16:44:50.399269 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.399232 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 18 16:44:50.440248 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440224 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440368 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440368 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440368 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcjm\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440368 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440337 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440368 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440572 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.440572 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.440461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.541705 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.541705 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.541902 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.541946 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.541996 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcjm\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.541996 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.541980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19740704-29b3-447d-9452-ddda50ca838f-config-volume\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fw4g\" (UniqueName: \"kubernetes.io/projected/19740704-29b3-447d-9452-ddda50ca838f-kube-api-access-2fw4g\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.542050 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.542071 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542089 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542394 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542394 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.542130 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.042108922 +0000 UTC m=+34.061769177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found
Mar 18 16:44:50.542394 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542253 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542394 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l97vp\" (UniqueName: \"kubernetes.io/projected/55064175-dfab-4504-98ca-526cf33504e7-kube-api-access-l97vp\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.542394 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19740704-29b3-447d-9452-ddda50ca838f-tmp-dir\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.542638 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.542962 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.542941 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.545766 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.545744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.545861 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.545751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.550579 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.550559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.550661 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.550611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcjm\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:50.630727 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.630668 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8"
Mar 18 16:44:50.632943 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.632925 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 18 16:44:50.642635 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l97vp\" (UniqueName: \"kubernetes.io/projected/55064175-dfab-4504-98ca-526cf33504e7-kube-api-access-l97vp\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.642724 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19740704-29b3-447d-9452-ddda50ca838f-tmp-dir\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.642724 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.642836 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.642752 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:50.642836 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.642797 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.142784869 +0000 UTC m=+34.162445113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:50.642836 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19740704-29b3-447d-9452-ddda50ca838f-config-volume\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.642992 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw4g\" (UniqueName: \"kubernetes.io/projected/19740704-29b3-447d-9452-ddda50ca838f-kube-api-access-2fw4g\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.642992 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.642992 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.642935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19740704-29b3-447d-9452-ddda50ca838f-tmp-dir\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.643127 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.643009 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:50.643127 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:50.643092 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.143076627 +0000 UTC m=+34.162736878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found
Mar 18 16:44:50.647866 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.647849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19740704-29b3-447d-9452-ddda50ca838f-config-volume\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.650403 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.650387 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fw4g\" (UniqueName: \"kubernetes.io/projected/19740704-29b3-447d-9452-ddda50ca838f-kube-api-access-2fw4g\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:50.650547 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.650531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l97vp\" (UniqueName: \"kubernetes.io/projected/55064175-dfab-4504-98ca-526cf33504e7-kube-api-access-l97vp\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:50.775899 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:50.775864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerStarted","Data":"a63d6cc027060f59c45bc22c6133d3776af38e144a64c5dc7d8b361f348d7e7a"}
Mar 18 16:44:51.045712 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.045682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:44:51.045904 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.045848 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:51.045904 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.045868 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found
Mar 18 16:44:51.045979 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.045935 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:52.045919827 +0000 UTC m=+35.065580076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found
Mar 18 16:44:51.146458 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.146398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:44:51.146458 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.146455 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:44:51.146597 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.146535 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:51.146597 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.146540 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:51.146597 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.146580 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:52.14656743 +0000 UTC m=+35.166227675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found
Mar 18 16:44:51.146698 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.146602 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:52.146585258 +0000 UTC m=+35.166245502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:51.247389 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.247364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:44:51.247537 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.247458 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:51.247537 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.247504 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:45:23.247492665 +0000 UTC m=+66.267152910 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:51.448643 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.448579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:44:51.448815 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.448742 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:51.448815 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.448763 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:51.448815 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.448773 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfl6g for pod openshift-network-diagnostics/network-check-target-n98cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:51.448928 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:51.448835 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g podName:5b14876c-8cf1-4df9-a7ba-bc940ec2ca85 nodeName:}" failed.
No retries permitted until 2026-03-18 16:45:23.448808031 +0000 UTC m=+66.468468275 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfl6g" (UniqueName: "kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g") pod "network-check-target-n98cl" (UID: "5b14876c-8cf1-4df9-a7ba-bc940ec2ca85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:51.630586 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.630558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:44:51.630586 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.630575 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:44:51.632834 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.632802 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:51.633144 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.633123 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\"" Mar 18 16:44:51.633144 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.633131 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:51.633583 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.633131 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:51.633583 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.633144 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\"" Mar 18 16:44:51.779922 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.779865 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="a63d6cc027060f59c45bc22c6133d3776af38e144a64c5dc7d8b361f348d7e7a" exitCode=0 Mar 18 16:44:51.779922 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:51.779901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"a63d6cc027060f59c45bc22c6133d3776af38e144a64c5dc7d8b361f348d7e7a"} Mar 18 16:44:52.054865 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.054839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:44:52.054997 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.054980 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:52.055042 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.055000 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found Mar 18 16:44:52.055075 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.055053 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.055033408 +0000 UTC m=+37.074693655 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found Mar 18 16:44:52.155122 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.155098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k" Mar 18 16:44:52.155210 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.155153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc" Mar 18 16:44:52.155249 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.155231 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:52.155291 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.155246 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:52.155291 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.155283 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.155269755 +0000 UTC m=+37.174930000 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found Mar 18 16:44:52.155358 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:52.155296 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.15529024 +0000 UTC m=+37.174950485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found Mar 18 16:44:52.556636 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.556610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:52.558964 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.558947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/263447b9-97dc-43ab-bd1c-1fc05a93e62a-original-pull-secret\") pod \"global-pull-secret-syncer-l26p8\" (UID: \"263447b9-97dc-43ab-bd1c-1fc05a93e62a\") " pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:52.740062 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.740033 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-l26p8" Mar 18 16:44:52.785076 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.785043 2571 generic.go:358] "Generic (PLEG): container finished" podID="943aaf2c-0467-48a6-ac3c-29da845f8444" containerID="329f3a42447ec2cf17057ab1b4dfe334e51c927aa91310e40339a4da65c125ac" exitCode=0 Mar 18 16:44:52.785227 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.785096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerDied","Data":"329f3a42447ec2cf17057ab1b4dfe334e51c927aa91310e40339a4da65c125ac"} Mar 18 16:44:52.905352 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:52.905207 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l26p8"] Mar 18 16:44:52.916900 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:44:52.916875 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263447b9_97dc_43ab_bd1c_1fc05a93e62a.slice/crio-2e718a2b1a6c05492ea223c04471e2052e7c4d41b53a4098d0fd998368713b56 WatchSource:0}: Error finding container 2e718a2b1a6c05492ea223c04471e2052e7c4d41b53a4098d0fd998368713b56: Status 404 returned error can't find the container with id 2e718a2b1a6c05492ea223c04471e2052e7c4d41b53a4098d0fd998368713b56 Mar 18 16:44:53.790275 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:53.790241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" event={"ID":"943aaf2c-0467-48a6-ac3c-29da845f8444","Type":"ContainerStarted","Data":"f634e49498b590cd287b3cf6d3a4563b654ea52868a0498c147c1a2cefa2fbf7"} Mar 18 16:44:53.791305 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:53.791282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l26p8" 
event={"ID":"263447b9-97dc-43ab-bd1c-1fc05a93e62a","Type":"ContainerStarted","Data":"2e718a2b1a6c05492ea223c04471e2052e7c4d41b53a4098d0fd998368713b56"} Mar 18 16:44:53.825012 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:53.824961 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5rc5k" podStartSLOduration=6.521110538 podStartE2EDuration="36.824948102s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:20.329425376 +0000 UTC m=+3.349085622" lastFinishedPulling="2026-03-18 16:44:50.633262933 +0000 UTC m=+33.652923186" observedRunningTime="2026-03-18 16:44:53.824280072 +0000 UTC m=+36.843940340" watchObservedRunningTime="2026-03-18 16:44:53.824948102 +0000 UTC m=+36.844608364" Mar 18 16:44:54.069867 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:54.069771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:44:54.070023 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.069941 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:54.070023 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.069962 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found Mar 18 16:44:54.070130 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.070032 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:58.070010913 +0000 UTC m=+41.089671162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found Mar 18 16:44:54.170373 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:54.170334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc" Mar 18 16:44:54.170545 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:54.170398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k" Mar 18 16:44:54.170545 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.170510 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:54.170545 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.170523 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:54.170686 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.170583 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:58.170562129 +0000 UTC m=+41.190222385 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found Mar 18 16:44:54.170686 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:54.170604 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:58.170594042 +0000 UTC m=+41.190254289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found Mar 18 16:44:57.800544 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:57.800507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l26p8" event={"ID":"263447b9-97dc-43ab-bd1c-1fc05a93e62a","Type":"ContainerStarted","Data":"8caec196aeb2038413d0db3a764ea64ea69870acf860d13e448e52b114ccfed1"} Mar 18 16:44:57.822645 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:57.822600 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l26p8" podStartSLOduration=17.950421789 podStartE2EDuration="21.822587101s" podCreationTimestamp="2026-03-18 16:44:36 +0000 UTC" firstStartedPulling="2026-03-18 16:44:52.918842388 +0000 UTC m=+35.938502632" lastFinishedPulling="2026-03-18 16:44:56.791007688 +0000 UTC m=+39.810667944" observedRunningTime="2026-03-18 16:44:57.822555245 +0000 UTC m=+40.842215514" watchObservedRunningTime="2026-03-18 16:44:57.822587101 +0000 UTC m=+40.842247360" Mar 18 16:44:58.102898 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:58.102797 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:44:58.103042 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.102946 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:58.103042 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.102966 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found Mar 18 16:44:58.103042 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.103021 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.10300628 +0000 UTC m=+49.122666525 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found Mar 18 16:44:58.203673 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:58.203629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k" Mar 18 16:44:58.203872 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.203783 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:58.203872 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:44:58.203789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc" Mar 18 16:44:58.203872 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.203865 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:58.203974 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.203866 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.203845506 +0000 UTC m=+49.223505767 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found Mar 18 16:44:58.203974 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:44:58.203923 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.203912293 +0000 UTC m=+49.223572538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found Mar 18 16:45:00.409040 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.409001 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs"] Mar 18 16:45:00.411979 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.411958 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.412278 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.412189 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"] Mar 18 16:45:00.414553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.414521 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Mar 18 16:45:00.414895 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.414800 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Mar 18 16:45:00.415318 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.415292 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-k5ttv\"" Mar 18 16:45:00.415796 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.415776 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Mar 18 16:45:00.416077 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.416051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Mar 18 16:45:00.417759 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.417738 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk"] Mar 18 16:45:00.417931 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.417915 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.420070 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.419870 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Mar 18 16:45:00.420937 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.420917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.421266 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.421125 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs"] Mar 18 16:45:00.422628 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.422608 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Mar 18 16:45:00.423022 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.423003 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Mar 18 16:45:00.423126 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.423023 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Mar 18 16:45:00.423195 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.423144 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Mar 18 16:45:00.424489 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.424471 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"] Mar 18 16:45:00.425374 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.425358 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk"] Mar 18 16:45:00.519206 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbsvg\" (UniqueName: \"kubernetes.io/projected/0de4261d-9a83-416f-929f-ee7c9b6a701c-kube-api-access-nbsvg\") pod \"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.519371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0de4261d-9a83-416f-929f-ee7c9b6a701c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.519371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56cf01a2-4433-4ec4-9d33-a321d6ddccab-tmp\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.519371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/56cf01a2-4433-4ec4-9d33-a321d6ddccab-klusterlet-config\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.519371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.519371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.519553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68jxm\" (UniqueName: \"kubernetes.io/projected/56cf01a2-4433-4ec4-9d33-a321d6ddccab-kube-api-access-68jxm\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.519553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.519553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qkd\" (UniqueName: \"kubernetes.io/projected/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-kube-api-access-95qkd\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.519553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.519553 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.519508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620125 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0de4261d-9a83-416f-929f-ee7c9b6a701c-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.620125 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56cf01a2-4433-4ec4-9d33-a321d6ddccab-tmp\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.620386 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/56cf01a2-4433-4ec4-9d33-a321d6ddccab-klusterlet-config\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.620452 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620506 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 
16:45:00.620559 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68jxm\" (UniqueName: \"kubernetes.io/projected/56cf01a2-4433-4ec4-9d33-a321d6ddccab-kube-api-access-68jxm\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.620559 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620660 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620560 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56cf01a2-4433-4ec4-9d33-a321d6ddccab-tmp\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.620660 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95qkd\" (UniqueName: \"kubernetes.io/projected/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-kube-api-access-95qkd\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620660 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620635 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620799 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.620799 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.620712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbsvg\" (UniqueName: \"kubernetes.io/projected/0de4261d-9a83-416f-929f-ee7c9b6a701c-kube-api-access-nbsvg\") pod \"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.621161 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.621129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.624273 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.624242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-ca\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: 
\"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.624273 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.624266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/56cf01a2-4433-4ec4-9d33-a321d6ddccab-klusterlet-config\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.624408 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.624325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.624408 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.624348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-hub\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.624408 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.624377 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0de4261d-9a83-416f-929f-ee7c9b6a701c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.624603 ip-10-0-137-219 kubenswrapper[2571]: I0318 
16:45:00.624584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.628354 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.628327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jxm\" (UniqueName: \"kubernetes.io/projected/56cf01a2-4433-4ec4-9d33-a321d6ddccab-kube-api-access-68jxm\") pod \"klusterlet-addon-workmgr-566b78785d-tvjr9\" (UID: \"56cf01a2-4433-4ec4-9d33-a321d6ddccab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.628480 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.628463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qkd\" (UniqueName: \"kubernetes.io/projected/8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d-kube-api-access-95qkd\") pod \"cluster-proxy-proxy-agent-c5965f9c5-wsbtk\" (UID: \"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.628539 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.628482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbsvg\" (UniqueName: \"kubernetes.io/projected/0de4261d-9a83-416f-929f-ee7c9b6a701c-kube-api-access-nbsvg\") pod \"managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs\" (UID: \"0de4261d-9a83-416f-929f-ee7c9b6a701c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.733998 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.733913 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" Mar 18 16:45:00.740048 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.740026 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:00.745839 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.745789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:45:00.880051 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.880024 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs"] Mar 18 16:45:00.883791 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:45:00.883764 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0de4261d_9a83_416f_929f_ee7c9b6a701c.slice/crio-a528c1990cb358aba892ac8512254114a2241ffd3462fae0f8249cabd55a790a WatchSource:0}: Error finding container a528c1990cb358aba892ac8512254114a2241ffd3462fae0f8249cabd55a790a: Status 404 returned error can't find the container with id a528c1990cb358aba892ac8512254114a2241ffd3462fae0f8249cabd55a790a Mar 18 16:45:00.891552 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.891530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"] Mar 18 16:45:00.894615 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:45:00.894587 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cf01a2_4433_4ec4_9d33_a321d6ddccab.slice/crio-b64c3ddd8467670271f241a1fc7e539c418d9b606c94d217c086e0bcdbed4261 WatchSource:0}: Error finding container 
b64c3ddd8467670271f241a1fc7e539c418d9b606c94d217c086e0bcdbed4261: Status 404 returned error can't find the container with id b64c3ddd8467670271f241a1fc7e539c418d9b606c94d217c086e0bcdbed4261 Mar 18 16:45:00.907210 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:00.907187 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk"] Mar 18 16:45:00.910257 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:45:00.910238 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8013e8ca_ad3c_4c29_99f3_81fc6aa8ed3d.slice/crio-77276825c015fc60860c1da570637700bff47131e906446421e2d6fb7f919b1b WatchSource:0}: Error finding container 77276825c015fc60860c1da570637700bff47131e906446421e2d6fb7f919b1b: Status 404 returned error can't find the container with id 77276825c015fc60860c1da570637700bff47131e906446421e2d6fb7f919b1b Mar 18 16:45:01.809215 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:01.809177 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" event={"ID":"0de4261d-9a83-416f-929f-ee7c9b6a701c","Type":"ContainerStarted","Data":"a528c1990cb358aba892ac8512254114a2241ffd3462fae0f8249cabd55a790a"} Mar 18 16:45:01.810131 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:01.810108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerStarted","Data":"77276825c015fc60860c1da570637700bff47131e906446421e2d6fb7f919b1b"} Mar 18 16:45:01.810967 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:01.810944 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" 
event={"ID":"56cf01a2-4433-4ec4-9d33-a321d6ddccab","Type":"ContainerStarted","Data":"b64c3ddd8467670271f241a1fc7e539c418d9b606c94d217c086e0bcdbed4261"} Mar 18 16:45:06.173246 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.173205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:45:06.173676 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.173392 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:06.173676 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.173415 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found Mar 18 16:45:06.173676 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.173489 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:22.173465725 +0000 UTC m=+65.193125972 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found Mar 18 16:45:06.274316 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.274271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc" Mar 18 16:45:06.274474 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.274367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k" Mar 18 16:45:06.274474 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.274407 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:06.274474 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.274473 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:22.274455736 +0000 UTC m=+65.294115982 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found Mar 18 16:45:06.274595 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.274485 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:06.274595 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:06.274534 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:45:22.274518862 +0000 UTC m=+65.294179123 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found Mar 18 16:45:06.826182 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.826135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerStarted","Data":"301c5ad76166cc16da2cc07f5d786cb46bb065ac1f2e28ead80803db3e190786"} Mar 18 16:45:06.827585 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.827555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" event={"ID":"0de4261d-9a83-416f-929f-ee7c9b6a701c","Type":"ContainerStarted","Data":"05dce7adeb94040d10dfdbd5588efd1ff168227d6a4340847847db9d400bcdac"} Mar 18 16:45:06.842744 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:06.842695 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" podStartSLOduration=2.000354254 podStartE2EDuration="6.842676232s" podCreationTimestamp="2026-03-18 16:45:00 +0000 UTC" firstStartedPulling="2026-03-18 16:45:00.885803213 +0000 UTC m=+43.905463461" lastFinishedPulling="2026-03-18 16:45:05.728125187 +0000 UTC m=+48.747785439" observedRunningTime="2026-03-18 16:45:06.841712987 +0000 UTC m=+49.861373253" watchObservedRunningTime="2026-03-18 16:45:06.842676232 +0000 UTC m=+49.862336501" Mar 18 16:45:08.833275 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:08.833235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" event={"ID":"56cf01a2-4433-4ec4-9d33-a321d6ddccab","Type":"ContainerStarted","Data":"b13923098340073c4cef130375e27820909304782583eb44048333ea5717d2ce"} Mar 18 16:45:08.833893 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:08.833462 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:08.835380 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:08.835358 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" Mar 18 16:45:08.851482 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:08.851429 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" podStartSLOduration=1.7947956459999999 podStartE2EDuration="8.851415046s" podCreationTimestamp="2026-03-18 16:45:00 +0000 UTC" firstStartedPulling="2026-03-18 16:45:00.896290173 +0000 UTC m=+43.915950418" lastFinishedPulling="2026-03-18 16:45:07.952909557 +0000 UTC m=+50.972569818" observedRunningTime="2026-03-18 16:45:08.848338569 +0000 UTC m=+51.867998830" 
watchObservedRunningTime="2026-03-18 16:45:08.851415046 +0000 UTC m=+51.871075312" Mar 18 16:45:09.837466 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:09.837430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerStarted","Data":"b93e9ff3e75156c3d1c691bb4481788fd0285e57523796beb48ea68b2a492646"} Mar 18 16:45:10.841338 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:10.841298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerStarted","Data":"d601ee6f6f87406fd6a650a9a402823b814c6183e192bbef9328281778f24aa4"} Mar 18 16:45:10.864683 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:10.864641 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" podStartSLOduration=2.087038498 podStartE2EDuration="10.864625909s" podCreationTimestamp="2026-03-18 16:45:00 +0000 UTC" firstStartedPulling="2026-03-18 16:45:00.911742543 +0000 UTC m=+43.931402789" lastFinishedPulling="2026-03-18 16:45:09.689329941 +0000 UTC m=+52.708990200" observedRunningTime="2026-03-18 16:45:10.863966415 +0000 UTC m=+53.883626681" watchObservedRunningTime="2026-03-18 16:45:10.864625909 +0000 UTC m=+53.884286176" Mar 18 16:45:15.939281 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:15.939254 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxnmv" Mar 18 16:45:22.185916 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:22.185879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod 
\"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:45:22.186309 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.185988 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:22.186309 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.186000 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found Mar 18 16:45:22.186309 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.186051 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:54.186036659 +0000 UTC m=+97.205696903 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found Mar 18 16:45:22.286191 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:22.286159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k" Mar 18 16:45:22.286290 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:22.286224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc" Mar 18 16:45:22.286361 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.286338 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:22.286433 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.286421 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:45:54.286397848 +0000 UTC m=+97.306058103 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found Mar 18 16:45:22.286473 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.286353 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:22.286505 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:22.286474 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:54.286460305 +0000 UTC m=+97.306120549 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found Mar 18 16:45:23.293302 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.293253 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:45:23.295337 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.295315 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:23.303803 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:23.303781 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:23.303899 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:23.303860 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:46:27.303840394 +0000 UTC m=+130.323500656 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : secret "metrics-daemon-secret" not found Mar 18 16:45:23.494515 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.494471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:45:23.496651 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.496631 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:45:23.506846 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.506810 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:45:23.517572 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.517548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfl6g\" (UniqueName: \"kubernetes.io/projected/5b14876c-8cf1-4df9-a7ba-bc940ec2ca85-kube-api-access-mfl6g\") pod \"network-check-target-n98cl\" (UID: \"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85\") " pod="openshift-network-diagnostics/network-check-target-n98cl" Mar 18 16:45:23.747291 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.747211 2571 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\""
Mar 18 16:45:23.755782 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.755761 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:45:23.866353 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.866323 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n98cl"]
Mar 18 16:45:23.869388 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:45:23.869363 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b14876c_8cf1_4df9_a7ba_bc940ec2ca85.slice/crio-1845543fbeffdfb1dedc6019f1b251c4e9ecfb7dbb1f0aadeb0b8b0dfcc6814b WatchSource:0}: Error finding container 1845543fbeffdfb1dedc6019f1b251c4e9ecfb7dbb1f0aadeb0b8b0dfcc6814b: Status 404 returned error can't find the container with id 1845543fbeffdfb1dedc6019f1b251c4e9ecfb7dbb1f0aadeb0b8b0dfcc6814b
Mar 18 16:45:23.878616 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:23.878589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n98cl" event={"ID":"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85","Type":"ContainerStarted","Data":"1845543fbeffdfb1dedc6019f1b251c4e9ecfb7dbb1f0aadeb0b8b0dfcc6814b"}
Mar 18 16:45:26.886760 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:26.886673 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n98cl" event={"ID":"5b14876c-8cf1-4df9-a7ba-bc940ec2ca85","Type":"ContainerStarted","Data":"aafcb52f640d8bb7fd0a526404de35faa4b1b94a859dbaab9d75dd04bfb0f277"}
Mar 18 16:45:26.887213 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:26.886828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:45:26.902573 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:26.902528 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n98cl" podStartSLOduration=67.172047999 podStartE2EDuration="1m9.902516093s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:45:23.871280989 +0000 UTC m=+66.890941238" lastFinishedPulling="2026-03-18 16:45:26.601749083 +0000 UTC m=+69.621409332" observedRunningTime="2026-03-18 16:45:26.901491633 +0000 UTC m=+69.921151899" watchObservedRunningTime="2026-03-18 16:45:26.902516093 +0000 UTC m=+69.922176359"
Mar 18 16:45:54.207460 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:54.207372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:45:54.207868 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.207518 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:45:54.207868 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.207537 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554f55bc5d-4v6vv: secret "image-registry-tls" not found
Mar 18 16:45:54.207868 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.207603 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls podName:a8cfb9f0-cf5a-4924-9434-6edca1028003 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:58.207584786 +0000 UTC m=+161.227245036 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls") pod "image-registry-554f55bc5d-4v6vv" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003") : secret "image-registry-tls" not found
Mar 18 16:45:54.307889 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:54.307849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:45:54.307961 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:54.307918 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:45:54.308044 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.307999 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:54.308088 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.308077 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls podName:19740704-29b3-447d-9452-ddda50ca838f nodeName:}" failed. No retries permitted until 2026-03-18 16:46:58.308060953 +0000 UTC m=+161.327721198 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls") pod "dns-default-png9k" (UID: "19740704-29b3-447d-9452-ddda50ca838f") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:54.308129 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.308004 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:54.308161 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:45:54.308131 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert podName:55064175-dfab-4504-98ca-526cf33504e7 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:58.308120416 +0000 UTC m=+161.327780662 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert") pod "ingress-canary-8dnhc" (UID: "55064175-dfab-4504-98ca-526cf33504e7") : secret "canary-serving-cert" not found
Mar 18 16:45:57.891666 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:45:57.891641 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n98cl"
Mar 18 16:46:27.345812 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:27.345759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:46:27.346327 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:27.345930 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:46:27.346327 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:27.346004 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs podName:268f643d-a996-4e11-b14c-70264a1426da nodeName:}" failed. No retries permitted until 2026-03-18 16:48:29.34598868 +0000 UTC m=+252.365648925 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs") pod "network-metrics-daemon-mqv2m" (UID: "268f643d-a996-4e11-b14c-70264a1426da") : secret "metrics-daemon-secret" not found
Mar 18 16:46:43.408754 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:43.408724 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zpjht_a6ec8ca6-7754-4403-be16-a059d6c3184b/dns-node-resolver/0.log"
Mar 18 16:46:44.208791 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:44.208760 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sv4zc_8ea30226-c756-46d3-b584-3613c9c03311/node-ca/0.log"
Mar 18 16:46:53.380534 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:53.380492 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003"
Mar 18 16:46:53.397617 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:53.397582 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-png9k" podUID="19740704-29b3-447d-9452-ddda50ca838f"
Mar 18 16:46:53.405739 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:53.405713 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8dnhc" podUID="55064175-dfab-4504-98ca-526cf33504e7"
Mar 18 16:46:54.086525 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:54.086500 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:46:54.086697 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:54.086505 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-png9k"
Mar 18 16:46:54.086697 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:54.086505 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:46:54.640946 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:46:54.640891 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mqv2m" podUID="268f643d-a996-4e11-b14c-70264a1426da"
Mar 18 16:46:58.279206 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.279130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:46:58.281362 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.281340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"image-registry-554f55bc5d-4v6vv\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:46:58.289769 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.289749 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s9hrx\""
Mar 18 16:46:58.297555 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.297536 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:46:58.379926 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.379893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:46:58.380109 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.379956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:46:58.382279 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.382251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19740704-29b3-447d-9452-ddda50ca838f-metrics-tls\") pod \"dns-default-png9k\" (UID: \"19740704-29b3-447d-9452-ddda50ca838f\") " pod="openshift-dns/dns-default-png9k"
Mar 18 16:46:58.382390 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.382311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55064175-dfab-4504-98ca-526cf33504e7-cert\") pod \"ingress-canary-8dnhc\" (UID: \"55064175-dfab-4504-98ca-526cf33504e7\") " pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:46:58.422815 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.422783 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"]
Mar 18 16:46:58.425974 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:46:58.425946 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8cfb9f0_cf5a_4924_9434_6edca1028003.slice/crio-f021e17421447eff27e29448c3a9a8010e74c59c676300a01bb34cb6886978ad WatchSource:0}: Error finding container f021e17421447eff27e29448c3a9a8010e74c59c676300a01bb34cb6886978ad: Status 404 returned error can't find the container with id f021e17421447eff27e29448c3a9a8010e74c59c676300a01bb34cb6886978ad
Mar 18 16:46:58.589690 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.589635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\""
Mar 18 16:46:58.589872 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.589763 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\""
Mar 18 16:46:58.598620 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.598599 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-png9k"
Mar 18 16:46:58.598692 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.598665 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8dnhc"
Mar 18 16:46:58.731630 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.731585 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8dnhc"]
Mar 18 16:46:58.735294 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:46:58.735266 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55064175_dfab_4504_98ca_526cf33504e7.slice/crio-4d1c7cfe2836503f976066b46ea71970e1b2996c2e4ba890ee735e1b5b4cc473 WatchSource:0}: Error finding container 4d1c7cfe2836503f976066b46ea71970e1b2996c2e4ba890ee735e1b5b4cc473: Status 404 returned error can't find the container with id 4d1c7cfe2836503f976066b46ea71970e1b2996c2e4ba890ee735e1b5b4cc473
Mar 18 16:46:58.751486 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:58.751431 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-png9k"]
Mar 18 16:46:58.756748 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:46:58.756721 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19740704_29b3_447d_9452_ddda50ca838f.slice/crio-84feb43705f6f1d5325942ec9bd9ec1ab38f04a4c72aa411b324669ca28f1a4f WatchSource:0}: Error finding container 84feb43705f6f1d5325942ec9bd9ec1ab38f04a4c72aa411b324669ca28f1a4f: Status 404 returned error can't find the container with id 84feb43705f6f1d5325942ec9bd9ec1ab38f04a4c72aa411b324669ca28f1a4f
Mar 18 16:46:59.099974 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.099940 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8dnhc" event={"ID":"55064175-dfab-4504-98ca-526cf33504e7","Type":"ContainerStarted","Data":"4d1c7cfe2836503f976066b46ea71970e1b2996c2e4ba890ee735e1b5b4cc473"}
Mar 18 16:46:59.101227 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.101201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" event={"ID":"a8cfb9f0-cf5a-4924-9434-6edca1028003","Type":"ContainerStarted","Data":"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413"}
Mar 18 16:46:59.101227 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.101231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" event={"ID":"a8cfb9f0-cf5a-4924-9434-6edca1028003","Type":"ContainerStarted","Data":"f021e17421447eff27e29448c3a9a8010e74c59c676300a01bb34cb6886978ad"}
Mar 18 16:46:59.101449 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.101431 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv"
Mar 18 16:46:59.102100 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.102080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-png9k" event={"ID":"19740704-29b3-447d-9452-ddda50ca838f","Type":"ContainerStarted","Data":"84feb43705f6f1d5325942ec9bd9ec1ab38f04a4c72aa411b324669ca28f1a4f"}
Mar 18 16:46:59.124529 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:46:59.124487 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" podStartSLOduration=161.124476603 podStartE2EDuration="2m41.124476603s" podCreationTimestamp="2026-03-18 16:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:59.123169635 +0000 UTC m=+162.142829903" watchObservedRunningTime="2026-03-18 16:46:59.124476603 +0000 UTC m=+162.144136869"
Mar 18 16:47:01.108624 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.108547 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8dnhc" event={"ID":"55064175-dfab-4504-98ca-526cf33504e7","Type":"ContainerStarted","Data":"c517fab8efcd1c55e5bd00a23d70d0c7ebc995f6e135491f39d0e8b950a6ef5f"}
Mar 18 16:47:01.110217 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.110189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-png9k" event={"ID":"19740704-29b3-447d-9452-ddda50ca838f","Type":"ContainerStarted","Data":"b026562b9edaa211f895b42aed165a0e8dcf870978432113439323ef76b4e6e6"}
Mar 18 16:47:01.110336 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.110222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-png9k" event={"ID":"19740704-29b3-447d-9452-ddda50ca838f","Type":"ContainerStarted","Data":"818367b47b1b9ef1e774136df21d96171cce38a8d05941af0f14b853bc54cc61"}
Mar 18 16:47:01.110717 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.110689 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-png9k"
Mar 18 16:47:01.156293 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.156252 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-png9k" podStartSLOduration=129.05943773 podStartE2EDuration="2m11.15624039s" podCreationTimestamp="2026-03-18 16:44:50 +0000 UTC" firstStartedPulling="2026-03-18 16:46:58.759159816 +0000 UTC m=+161.778820061" lastFinishedPulling="2026-03-18 16:47:00.855962467 +0000 UTC m=+163.875622721" observedRunningTime="2026-03-18 16:47:01.155486933 +0000 UTC m=+164.175147199" watchObservedRunningTime="2026-03-18 16:47:01.15624039 +0000 UTC m=+164.175900656"
Mar 18 16:47:01.156680 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:01.156655 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8dnhc" podStartSLOduration=129.038944989 podStartE2EDuration="2m11.156649651s" podCreationTimestamp="2026-03-18 16:44:50 +0000 UTC" firstStartedPulling="2026-03-18 16:46:58.737138423 +0000 UTC m=+161.756798671" lastFinishedPulling="2026-03-18 16:47:00.854843077 +0000 UTC m=+163.874503333" observedRunningTime="2026-03-18 16:47:01.133645699 +0000 UTC m=+164.153305966" watchObservedRunningTime="2026-03-18 16:47:01.156649651 +0000 UTC m=+164.176309918"
Mar 18 16:47:06.124023 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:06.123989 2571 generic.go:358] "Generic (PLEG): container finished" podID="0de4261d-9a83-416f-929f-ee7c9b6a701c" containerID="05dce7adeb94040d10dfdbd5588efd1ff168227d6a4340847847db9d400bcdac" exitCode=255
Mar 18 16:47:06.124392 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:06.124049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" event={"ID":"0de4261d-9a83-416f-929f-ee7c9b6a701c","Type":"ContainerDied","Data":"05dce7adeb94040d10dfdbd5588efd1ff168227d6a4340847847db9d400bcdac"}
Mar 18 16:47:06.124392 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:06.124380 2571 scope.go:117] "RemoveContainer" containerID="05dce7adeb94040d10dfdbd5588efd1ff168227d6a4340847847db9d400bcdac"
Mar 18 16:47:06.631210 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:06.631172 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m"
Mar 18 16:47:07.127423 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:07.127387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-796c8fd6c7-rk5cs" event={"ID":"0de4261d-9a83-416f-929f-ee7c9b6a701c","Type":"ContainerStarted","Data":"539566ab9f59453b21c32c64f30a6e95492f504463c30c48c75c0db0ee934d92"}
Mar 18 16:47:08.130656 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:08.130616 2571 generic.go:358] "Generic (PLEG): container finished" podID="56cf01a2-4433-4ec4-9d33-a321d6ddccab" containerID="b13923098340073c4cef130375e27820909304782583eb44048333ea5717d2ce" exitCode=1
Mar 18 16:47:08.130995 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:08.130692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" event={"ID":"56cf01a2-4433-4ec4-9d33-a321d6ddccab","Type":"ContainerDied","Data":"b13923098340073c4cef130375e27820909304782583eb44048333ea5717d2ce"}
Mar 18 16:47:08.131054 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:08.131033 2571 scope.go:117] "RemoveContainer" containerID="b13923098340073c4cef130375e27820909304782583eb44048333ea5717d2ce"
Mar 18 16:47:08.833653 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:08.833606 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"
Mar 18 16:47:09.134784 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:09.134695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9" event={"ID":"56cf01a2-4433-4ec4-9d33-a321d6ddccab","Type":"ContainerStarted","Data":"ad4317ddc38d3bf7b267c202f70279aad3c9bdbc472ff9fe0fee0c8834a0bfa2"}
Mar 18 16:47:09.135248 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:09.134919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"
Mar 18 16:47:09.135550 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:09.135533 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-566b78785d-tvjr9"
Mar 18 16:47:10.625879 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.625847 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fqvd4"]
Mar 18 16:47:10.628991 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.628971 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.631038 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.631013 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:47:10.631157 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.631022 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:47:10.632019 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.631989 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:47:10.632107 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.632031 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:47:10.632107 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.632069 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h8gkt\""
Mar 18 16:47:10.643146 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.643119 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fqvd4"]
Mar 18 16:47:10.661461 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.661429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.661628 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.661477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/df81206e-e9a8-44a0-a891-d6f01e5e71d8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.661628 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.661582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/df81206e-e9a8-44a0-a891-d6f01e5e71d8-crio-socket\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.661628 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.661608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqp7\" (UniqueName: \"kubernetes.io/projected/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-api-access-ckqp7\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.661767 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.661705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/df81206e-e9a8-44a0-a891-d6f01e5e71d8-data-volume\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762419 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762419 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/df81206e-e9a8-44a0-a891-d6f01e5e71d8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762672 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/df81206e-e9a8-44a0-a891-d6f01e5e71d8-crio-socket\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762672 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqp7\" (UniqueName: \"kubernetes.io/projected/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-api-access-ckqp7\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762672 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/df81206e-e9a8-44a0-a891-d6f01e5e71d8-data-volume\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762672 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/df81206e-e9a8-44a0-a891-d6f01e5e71d8-crio-socket\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.762810 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.762800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/df81206e-e9a8-44a0-a891-d6f01e5e71d8-data-volume\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.763050 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.763033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.764875 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.764854 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/df81206e-e9a8-44a0-a891-d6f01e5e71d8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.774485 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.774460 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqp7\" (UniqueName: \"kubernetes.io/projected/df81206e-e9a8-44a0-a891-d6f01e5e71d8-kube-api-access-ckqp7\") pod \"insights-runtime-extractor-fqvd4\" (UID: \"df81206e-e9a8-44a0-a891-d6f01e5e71d8\") " pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:10.937676 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:10.937573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fqvd4"
Mar 18 16:47:11.056549 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:11.056515 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fqvd4"]
Mar 18 16:47:11.060561 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:47:11.060523 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf81206e_e9a8_44a0_a891_d6f01e5e71d8.slice/crio-4ea1fd346b68670a64b4c712e02f10728d10ebe81d1700cc7050c94112cd4f26 WatchSource:0}: Error finding container 4ea1fd346b68670a64b4c712e02f10728d10ebe81d1700cc7050c94112cd4f26: Status 404 returned error can't find the container with id 4ea1fd346b68670a64b4c712e02f10728d10ebe81d1700cc7050c94112cd4f26
Mar 18 16:47:11.141389 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:11.141355 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqvd4" event={"ID":"df81206e-e9a8-44a0-a891-d6f01e5e71d8","Type":"ContainerStarted","Data":"40fa50977c58d0b71b5d90be0829818c9e371f51522ed4429b75585f1eb76d4d"}
Mar 18 16:47:11.141389 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:11.141400 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqvd4" event={"ID":"df81206e-e9a8-44a0-a891-d6f01e5e71d8","Type":"ContainerStarted","Data":"4ea1fd346b68670a64b4c712e02f10728d10ebe81d1700cc7050c94112cd4f26"}
Mar 18 16:47:12.117459 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:12.117367 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-png9k"
Mar 18 16:47:12.146203 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:12.146168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqvd4" event={"ID":"df81206e-e9a8-44a0-a891-d6f01e5e71d8","Type":"ContainerStarted","Data":"73565e0ac1ba5970b3ccbcd5e3f6718cf7531f6fdff51d9266467f149f36b887"}
Mar 18 16:47:14.153657 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:14.153614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqvd4" event={"ID":"df81206e-e9a8-44a0-a891-d6f01e5e71d8","Type":"ContainerStarted","Data":"83d7a9743728c22a90696e89357dbf9dbbc87658f100869fb6d65238629eadd5"}
Mar 18 16:47:14.172072 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:14.172023 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fqvd4" podStartSLOduration=2.057901382 podStartE2EDuration="4.172009177s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:11.114665617 +0000 UTC m=+174.134325862" lastFinishedPulling="2026-03-18 16:47:13.22877341 +0000 UTC m=+176.248433657" observedRunningTime="2026-03-18 16:47:14.171129928 +0000 UTC m=+177.190790224" watchObservedRunningTime="2026-03-18 16:47:14.172009177 +0000 UTC m=+177.191669445"
Mar 18 16:47:18.301527 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:18.301492 2571 patch_prober.go:28] interesting pod/image-registry-554f55bc5d-4v6vv container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Mar 18 16:47:18.301972 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:18.301545 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:47:19.441632 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.441597 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w2hjw"]
Mar 18 16:47:19.445858 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.445838 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w2hjw"
Mar 18 16:47:19.449900 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-775mt\""
Mar 18 16:47:19.450013 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449882 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Mar 18 16:47:19.450080 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449883 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Mar 18 16:47:19.450080 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449898 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:47:19.450186 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449901 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:47:19.450186 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.449958 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:47:19.450787 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.450769 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Mar 18 16:47:19.525286 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-root\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") "
pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525286 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-textfile\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-wtmp\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-metrics-client-ca\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525386 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-tls\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525405 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nw7s\" (UniqueName: \"kubernetes.io/projected/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-kube-api-access-4nw7s\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.525491 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.525463 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-sys\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626634 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-metrics-client-ca\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626798 ip-10-0-137-219 
kubenswrapper[2571]: I0318 16:47:19.626647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-tls\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626798 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626899 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626899 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nw7s\" (UniqueName: \"kubernetes.io/projected/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-kube-api-access-4nw7s\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626982 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-sys\") pod \"node-exporter-w2hjw\" (UID: 
\"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.626982 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-root\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627083 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.626988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-textfile\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627083 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-wtmp\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627083 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-sys\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627227 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-root\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " 
pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627227 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-wtmp\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627330 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-metrics-client-ca\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627445 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-textfile\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.627529 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.627425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.629138 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.629116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-tls\") pod 
\"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.629226 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.629171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.641932 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.641907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nw7s\" (UniqueName: \"kubernetes.io/projected/b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0-kube-api-access-4nw7s\") pod \"node-exporter-w2hjw\" (UID: \"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0\") " pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.754931 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:19.754842 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-w2hjw" Mar 18 16:47:19.762937 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:47:19.762907 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb35dc19b_c62a_4812_aa2d_2bf6fe09a5b0.slice/crio-812a7ab7998fcd188b4d73197e3c86383b72e4f70556e317fad3e5a758ef208c WatchSource:0}: Error finding container 812a7ab7998fcd188b4d73197e3c86383b72e4f70556e317fad3e5a758ef208c: Status 404 returned error can't find the container with id 812a7ab7998fcd188b4d73197e3c86383b72e4f70556e317fad3e5a758ef208c Mar 18 16:47:20.109307 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:20.109281 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:47:20.169498 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:20.169468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w2hjw" event={"ID":"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0","Type":"ContainerStarted","Data":"812a7ab7998fcd188b4d73197e3c86383b72e4f70556e317fad3e5a758ef208c"} Mar 18 16:47:21.173741 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:21.173710 2571 generic.go:358] "Generic (PLEG): container finished" podID="b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0" containerID="f801fe6ecc294ae1392250e69ae1294e9170c41ee58940c9b127bdea32debca2" exitCode=0 Mar 18 16:47:21.174141 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:21.173759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w2hjw" event={"ID":"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0","Type":"ContainerDied","Data":"f801fe6ecc294ae1392250e69ae1294e9170c41ee58940c9b127bdea32debca2"} Mar 18 16:47:22.178254 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:22.178218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w2hjw" 
event={"ID":"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0","Type":"ContainerStarted","Data":"e1dc4d7e06ce962c2c08a8f26dcd69c642c3b1fd8f884d214b7136638361b043"} Mar 18 16:47:22.178254 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:22.178257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w2hjw" event={"ID":"b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0","Type":"ContainerStarted","Data":"bfa574020439ac8fa2003bef6829c1ae090c22102cc86901f91cd6743d408c1c"} Mar 18 16:47:32.797585 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:32.797509 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w2hjw" podStartSLOduration=13.063576708 podStartE2EDuration="13.797492417s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.764406758 +0000 UTC m=+182.784067006" lastFinishedPulling="2026-03-18 16:47:20.498322467 +0000 UTC m=+183.517982715" observedRunningTime="2026-03-18 16:47:22.259544925 +0000 UTC m=+185.279205203" watchObservedRunningTime="2026-03-18 16:47:32.797492417 +0000 UTC m=+195.817152684" Mar 18 16:47:32.798511 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:32.798489 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"] Mar 18 16:47:40.747584 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:40.747543 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" podUID="8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:47:50.747054 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:50.747009 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" podUID="8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:47:55.755116 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:55.755087 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/init-textfile/0.log" Mar 18 16:47:55.950872 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:55.950845 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/node-exporter/0.log" Mar 18 16:47:56.150147 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:56.150120 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/kube-rbac-proxy/0.log" Mar 18 16:47:57.816715 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:57.816658 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerName="registry" containerID="cri-o://70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413" gracePeriod=30 Mar 18 16:47:58.079187 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.079163 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:47:58.134322 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134281 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134344 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134382 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134416 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vcjm\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134444 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: 
\"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134479 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134518 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134501 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.134850 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.134530 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted\") pod \"a8cfb9f0-cf5a-4924-9434-6edca1028003\" (UID: \"a8cfb9f0-cf5a-4924-9434-6edca1028003\") " Mar 18 16:47:58.135132 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.135102 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:58.135213 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.135162 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:58.136906 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.136850 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:58.137019 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.136937 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm" (OuterVolumeSpecName: "kube-api-access-8vcjm") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "kube-api-access-8vcjm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:58.137136 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.137109 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:58.137224 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.137205 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:58.137326 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.137307 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:58.143197 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.143168 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a8cfb9f0-cf5a-4924-9434-6edca1028003" (UID: "a8cfb9f0-cf5a-4924-9434-6edca1028003"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:47:58.235241 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235209 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-installation-pull-secrets\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235241 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235238 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vcjm\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-kube-api-access-8vcjm\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235241 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235249 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-tls\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235447 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235258 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-trusted-ca\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235447 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235267 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8cfb9f0-cf5a-4924-9434-6edca1028003-registry-certificates\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235447 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235275 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8cfb9f0-cf5a-4924-9434-6edca1028003-ca-trust-extracted\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235447 
ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235285 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8cfb9f0-cf5a-4924-9434-6edca1028003-image-registry-private-configuration\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.235447 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.235294 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8cfb9f0-cf5a-4924-9434-6edca1028003-bound-sa-token\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:47:58.267266 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.267228 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerID="70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413" exitCode=0 Mar 18 16:47:58.267398 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.267274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" event={"ID":"a8cfb9f0-cf5a-4924-9434-6edca1028003","Type":"ContainerDied","Data":"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413"} Mar 18 16:47:58.267398 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.267287 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" Mar 18 16:47:58.267398 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.267296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554f55bc5d-4v6vv" event={"ID":"a8cfb9f0-cf5a-4924-9434-6edca1028003","Type":"ContainerDied","Data":"f021e17421447eff27e29448c3a9a8010e74c59c676300a01bb34cb6886978ad"} Mar 18 16:47:58.267398 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.267311 2571 scope.go:117] "RemoveContainer" containerID="70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413" Mar 18 16:47:58.276774 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.276756 2571 scope.go:117] "RemoveContainer" containerID="70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413" Mar 18 16:47:58.277100 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:47:58.277079 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413\": container with ID starting with 70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413 not found: ID does not exist" containerID="70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413" Mar 18 16:47:58.277166 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.277109 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413"} err="failed to get container status \"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413\": rpc error: code = NotFound desc = could not find container \"70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413\": container with ID starting with 70850a63ee572a1cb7a24417985f28fe3e03530001e6e622feefab46e40eb413 not found: ID does not exist" Mar 18 16:47:58.289627 ip-10-0-137-219 kubenswrapper[2571]: I0318 
16:47:58.289601 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"] Mar 18 16:47:58.292855 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:58.292832 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-554f55bc5d-4v6vv"] Mar 18 16:47:59.634059 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:47:59.634024 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" path="/var/lib/kubelet/pods/a8cfb9f0-cf5a-4924-9434-6edca1028003/volumes" Mar 18 16:48:00.749141 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:00.747380 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" podUID="8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:48:00.749141 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:00.747488 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" Mar 18 16:48:00.749141 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:00.748346 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"d601ee6f6f87406fd6a650a9a402823b814c6183e192bbef9328281778f24aa4"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Mar 18 16:48:00.749141 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:00.748394 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" podUID="8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d" containerName="service-proxy" 
containerID="cri-o://d601ee6f6f87406fd6a650a9a402823b814c6183e192bbef9328281778f24aa4" gracePeriod=30 Mar 18 16:48:01.278261 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:01.278227 2571 generic.go:358] "Generic (PLEG): container finished" podID="8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d" containerID="d601ee6f6f87406fd6a650a9a402823b814c6183e192bbef9328281778f24aa4" exitCode=2 Mar 18 16:48:01.278411 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:01.278290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerDied","Data":"d601ee6f6f87406fd6a650a9a402823b814c6183e192bbef9328281778f24aa4"} Mar 18 16:48:01.278411 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:01.278324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c5965f9c5-wsbtk" event={"ID":"8013e8ca-ad3c-4c29-99f3-81fc6aa8ed3d","Type":"ContainerStarted","Data":"95a8fcea1d0bddaad66a512636481de5c53d99f131ca178da7fdecec4657727b"} Mar 18 16:48:02.149523 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:02.149488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8dnhc_55064175-dfab-4504-98ca-526cf33504e7/serve-healthcheck-canary/0.log" Mar 18 16:48:29.353654 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:29.353571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:48:29.355866 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:29.355845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/268f643d-a996-4e11-b14c-70264a1426da-metrics-certs\") pod \"network-metrics-daemon-mqv2m\" (UID: \"268f643d-a996-4e11-b14c-70264a1426da\") " pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:48:29.433736 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:29.433703 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\"" Mar 18 16:48:29.442398 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:29.442376 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqv2m" Mar 18 16:48:29.556745 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:29.556712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqv2m"] Mar 18 16:48:29.560434 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:48:29.560405 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod268f643d_a996_4e11_b14c_70264a1426da.slice/crio-a1c1794a55278ebeebd509cf3acefc0a559f1c204c8af19ea6628c477f09a38b WatchSource:0}: Error finding container a1c1794a55278ebeebd509cf3acefc0a559f1c204c8af19ea6628c477f09a38b: Status 404 returned error can't find the container with id a1c1794a55278ebeebd509cf3acefc0a559f1c204c8af19ea6628c477f09a38b Mar 18 16:48:30.352112 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:30.352073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqv2m" event={"ID":"268f643d-a996-4e11-b14c-70264a1426da","Type":"ContainerStarted","Data":"a1c1794a55278ebeebd509cf3acefc0a559f1c204c8af19ea6628c477f09a38b"} Mar 18 16:48:31.356119 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:31.356084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqv2m" 
event={"ID":"268f643d-a996-4e11-b14c-70264a1426da","Type":"ContainerStarted","Data":"6bb7cf8a26b492e504b12f6f9f718f0e2f0752282b50d678d729d5272c691784"} Mar 18 16:48:31.356119 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:31.356125 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqv2m" event={"ID":"268f643d-a996-4e11-b14c-70264a1426da","Type":"ContainerStarted","Data":"e214aff66c25e3ff7bb800baf4d861f0c5c09be378ea0cf7c97729de838299aa"} Mar 18 16:48:31.373085 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:48:31.373033 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mqv2m" podStartSLOduration=253.473818767 podStartE2EDuration="4m14.373017309s" podCreationTimestamp="2026-03-18 16:44:17 +0000 UTC" firstStartedPulling="2026-03-18 16:48:29.56545893 +0000 UTC m=+252.585119181" lastFinishedPulling="2026-03-18 16:48:30.464657478 +0000 UTC m=+253.484317723" observedRunningTime="2026-03-18 16:48:31.371293424 +0000 UTC m=+254.390953691" watchObservedRunningTime="2026-03-18 16:48:31.373017309 +0000 UTC m=+254.392677576" Mar 18 16:49:17.494154 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:49:17.494123 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:49:17.495757 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:49:17.495735 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:49:17.498678 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:49:17.498659 2571 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:51:03.923149 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.923114 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt"] Mar 18 
16:51:03.923573 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.923341 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerName="registry" Mar 18 16:51:03.923573 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.923352 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerName="registry" Mar 18 16:51:03.923573 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.923420 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8cfb9f0-cf5a-4924-9434-6edca1028003" containerName="registry" Mar 18 16:51:03.926158 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.926137 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:03.928173 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.928148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:51:03.928283 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.928188 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:51:03.928283 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.928250 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Mar 18 16:51:03.928743 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.928727 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-vgwz2\"" Mar 18 16:51:03.936216 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.936195 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt"] Mar 18 16:51:03.955396 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.955369 2571 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve/seaweedfs-748c497bc-5kn6w"] Mar 18 16:51:03.958070 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.958054 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:03.959905 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.959885 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Mar 18 16:51:03.960035 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.959981 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qwszx\"" Mar 18 16:51:03.965420 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:03.965396 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-5kn6w"] Mar 18 16:51:04.003870 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.003840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.004043 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.003906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6ng\" (UniqueName: \"kubernetes.io/projected/a9b61867-7be1-44e9-8de6-b40798d1055d-kube-api-access-nh6ng\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.104928 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.104894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6ng\" (UniqueName: 
\"kubernetes.io/projected/a9b61867-7be1-44e9-8de6-b40798d1055d-kube-api-access-nh6ng\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.105046 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.104938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.105046 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.104960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjg5\" (UniqueName: \"kubernetes.io/projected/85251014-67ea-4213-b89c-89a3afacfed5-kube-api-access-nrjg5\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.105046 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.104979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/85251014-67ea-4213-b89c-89a3afacfed5-data\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.105157 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:51:04.105055 2571 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Mar 18 16:51:04.105157 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:51:04.105132 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert podName:a9b61867-7be1-44e9-8de6-b40798d1055d nodeName:}" failed. 
No retries permitted until 2026-03-18 16:51:04.605115827 +0000 UTC m=+407.624776072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert") pod "llmisvc-controller-manager-68cc5db7c4-fk8zt" (UID: "a9b61867-7be1-44e9-8de6-b40798d1055d") : secret "llmisvc-webhook-server-cert" not found Mar 18 16:51:04.115400 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.115366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6ng\" (UniqueName: \"kubernetes.io/projected/a9b61867-7be1-44e9-8de6-b40798d1055d-kube-api-access-nh6ng\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.206333 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.206237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjg5\" (UniqueName: \"kubernetes.io/projected/85251014-67ea-4213-b89c-89a3afacfed5-kube-api-access-nrjg5\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.206333 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.206298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/85251014-67ea-4213-b89c-89a3afacfed5-data\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.206783 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.206742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/85251014-67ea-4213-b89c-89a3afacfed5-data\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 
16:51:04.214715 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.214690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjg5\" (UniqueName: \"kubernetes.io/projected/85251014-67ea-4213-b89c-89a3afacfed5-kube-api-access-nrjg5\") pod \"seaweedfs-748c497bc-5kn6w\" (UID: \"85251014-67ea-4213-b89c-89a3afacfed5\") " pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.267928 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.267895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-5kn6w" Mar 18 16:51:04.401020 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.400995 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-5kn6w"] Mar 18 16:51:04.402343 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:51:04.402312 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85251014_67ea_4213_b89c_89a3afacfed5.slice/crio-f0559157d624fc17bc9ab71f3d887e22f608f7086491e6456f831963730a6364 WatchSource:0}: Error finding container f0559157d624fc17bc9ab71f3d887e22f608f7086491e6456f831963730a6364: Status 404 returned error can't find the container with id f0559157d624fc17bc9ab71f3d887e22f608f7086491e6456f831963730a6364 Mar 18 16:51:04.403667 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.403646 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:51:04.608596 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.608556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.610938 ip-10-0-137-219 kubenswrapper[2571]: I0318 
16:51:04.610920 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b61867-7be1-44e9-8de6-b40798d1055d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fk8zt\" (UID: \"a9b61867-7be1-44e9-8de6-b40798d1055d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:04.738912 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.738873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-5kn6w" event={"ID":"85251014-67ea-4213-b89c-89a3afacfed5","Type":"ContainerStarted","Data":"f0559157d624fc17bc9ab71f3d887e22f608f7086491e6456f831963730a6364"} Mar 18 16:51:04.835719 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:04.835684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:05.075180 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.075128 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt"] Mar 18 16:51:05.078811 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:51:05.078771 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda9b61867_7be1_44e9_8de6_b40798d1055d.slice/crio-c04c0b41ae8de52ec2266e26ef5954ef0b76456c3218af9ce91f61ef0eabb01d WatchSource:0}: Error finding container c04c0b41ae8de52ec2266e26ef5954ef0b76456c3218af9ce91f61ef0eabb01d: Status 404 returned error can't find the container with id c04c0b41ae8de52ec2266e26ef5954ef0b76456c3218af9ce91f61ef0eabb01d Mar 18 16:51:05.355646 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.355614 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-wnx8j"] Mar 18 16:51:05.358953 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.358931 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.361584 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.361529 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Mar 18 16:51:05.361584 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.361567 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-q7g85\"" Mar 18 16:51:05.365946 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.365919 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-wnx8j"] Mar 18 16:51:05.514850 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.514797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxj99\" (UniqueName: \"kubernetes.io/projected/6c069101-5600-4887-a294-751b45c6512d-kube-api-access-pxj99\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.515037 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.514912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c069101-5600-4887-a294-751b45c6512d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.615849 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.615750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxj99\" (UniqueName: \"kubernetes.io/projected/6c069101-5600-4887-a294-751b45c6512d-kube-api-access-pxj99\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " 
pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.615849 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.615808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c069101-5600-4887-a294-751b45c6512d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.618271 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.618248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c069101-5600-4887-a294-751b45c6512d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.624329 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.624297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxj99\" (UniqueName: \"kubernetes.io/projected/6c069101-5600-4887-a294-751b45c6512d-kube-api-access-pxj99\") pod \"kserve-controller-manager-69d7c9bbdc-wnx8j\" (UID: \"6c069101-5600-4887-a294-751b45c6512d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.673210 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.673178 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:05.742903 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:05.742862 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" event={"ID":"a9b61867-7be1-44e9-8de6-b40798d1055d","Type":"ContainerStarted","Data":"c04c0b41ae8de52ec2266e26ef5954ef0b76456c3218af9ce91f61ef0eabb01d"} Mar 18 16:51:06.106378 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:06.106342 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-wnx8j"] Mar 18 16:51:06.167897 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:51:06.167858 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c069101_5600_4887_a294_751b45c6512d.slice/crio-ee50bc9e2e79123db6e00c4943001c2f09768fee77412ab552be19f4e52c145e WatchSource:0}: Error finding container ee50bc9e2e79123db6e00c4943001c2f09768fee77412ab552be19f4e52c145e: Status 404 returned error can't find the container with id ee50bc9e2e79123db6e00c4943001c2f09768fee77412ab552be19f4e52c145e Mar 18 16:51:06.747466 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:06.747424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" event={"ID":"6c069101-5600-4887-a294-751b45c6512d","Type":"ContainerStarted","Data":"ee50bc9e2e79123db6e00c4943001c2f09768fee77412ab552be19f4e52c145e"} Mar 18 16:51:09.756882 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.756855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" event={"ID":"a9b61867-7be1-44e9-8de6-b40798d1055d","Type":"ContainerStarted","Data":"0eb0bae861eac36ddd1374cd91cf106b4b6f66922b7a5c774cfa4c22fdd2df63"} Mar 18 16:51:09.757252 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.756938 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:09.758076 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.758054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" event={"ID":"6c069101-5600-4887-a294-751b45c6512d","Type":"ContainerStarted","Data":"14cb24b428c3d9a916b4c6821f63f7f35c70c8494993b8be5983339b8f378248"} Mar 18 16:51:09.758202 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.758188 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:51:09.773450 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.773403 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" podStartSLOduration=2.200766779 podStartE2EDuration="6.773387986s" podCreationTimestamp="2026-03-18 16:51:03 +0000 UTC" firstStartedPulling="2026-03-18 16:51:05.080219341 +0000 UTC m=+408.099879586" lastFinishedPulling="2026-03-18 16:51:09.652840541 +0000 UTC m=+412.672500793" observedRunningTime="2026-03-18 16:51:09.771974102 +0000 UTC m=+412.791634369" watchObservedRunningTime="2026-03-18 16:51:09.773387986 +0000 UTC m=+412.793048310" Mar 18 16:51:09.785591 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:09.785542 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" podStartSLOduration=1.303060654 podStartE2EDuration="4.78552418s" podCreationTimestamp="2026-03-18 16:51:05 +0000 UTC" firstStartedPulling="2026-03-18 16:51:06.170282292 +0000 UTC m=+409.189942542" lastFinishedPulling="2026-03-18 16:51:09.652745819 +0000 UTC m=+412.672406068" observedRunningTime="2026-03-18 16:51:09.785232251 +0000 UTC m=+412.804892518" watchObservedRunningTime="2026-03-18 16:51:09.78552418 +0000 UTC m=+412.805184444" Mar 18 16:51:10.761956 ip-10-0-137-219 
kubenswrapper[2571]: I0318 16:51:10.761921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-5kn6w" event={"ID":"85251014-67ea-4213-b89c-89a3afacfed5","Type":"ContainerStarted","Data":"de6411691d89f15834f4cdd78a1f09b633b149e124b2083304362a395f813939"} Mar 18 16:51:10.777427 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:10.777362 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-748c497bc-5kn6w" podStartSLOduration=2.473234384 podStartE2EDuration="7.777344558s" podCreationTimestamp="2026-03-18 16:51:03 +0000 UTC" firstStartedPulling="2026-03-18 16:51:04.403853075 +0000 UTC m=+407.423513324" lastFinishedPulling="2026-03-18 16:51:09.707963236 +0000 UTC m=+412.727623498" observedRunningTime="2026-03-18 16:51:10.776320258 +0000 UTC m=+413.795980525" watchObservedRunningTime="2026-03-18 16:51:10.777344558 +0000 UTC m=+413.797004826" Mar 18 16:51:40.763761 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:40.763665 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fk8zt" Mar 18 16:51:40.766796 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:51:40.766773 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-wnx8j" Mar 18 16:52:09.046543 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.046514 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:09.049451 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.049435 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.051426 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.051409 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Mar 18 16:52:09.057653 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.057631 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:09.142012 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.141987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.142119 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.142044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgs9z\" (UniqueName: \"kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.242653 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.242630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgs9z\" (UniqueName: \"kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.242739 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.242672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.243000 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.242985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.250233 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.250213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgs9z\" (UniqueName: \"kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z\") pod \"seaweedfs-tls-custom-ddd4dbfd-q9qhz\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.358266 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.358211 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:09.471236 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.471207 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:09.474185 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:52:09.474154 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fc8710_f867_40a1_b662_f60e7caab690.slice/crio-6bd40d09ea78ce2f6e58fac2e7dbbd46d416b23507bf48246f0950cf4ea3e235 WatchSource:0}: Error finding container 6bd40d09ea78ce2f6e58fac2e7dbbd46d416b23507bf48246f0950cf4ea3e235: Status 404 returned error can't find the container with id 6bd40d09ea78ce2f6e58fac2e7dbbd46d416b23507bf48246f0950cf4ea3e235 Mar 18 16:52:09.911660 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.911622 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" event={"ID":"18fc8710-f867-40a1-b662-f60e7caab690","Type":"ContainerStarted","Data":"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78"} Mar 18 16:52:09.911660 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.911665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" event={"ID":"18fc8710-f867-40a1-b662-f60e7caab690","Type":"ContainerStarted","Data":"6bd40d09ea78ce2f6e58fac2e7dbbd46d416b23507bf48246f0950cf4ea3e235"} Mar 18 16:52:09.926548 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:09.926497 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" podStartSLOduration=0.64866966 podStartE2EDuration="926.479682ms" podCreationTimestamp="2026-03-18 16:52:09 +0000 UTC" firstStartedPulling="2026-03-18 16:52:09.475405816 +0000 UTC m=+472.495066062" lastFinishedPulling="2026-03-18 16:52:09.753215835 +0000 UTC m=+472.772876084" 
observedRunningTime="2026-03-18 16:52:09.924939341 +0000 UTC m=+472.944599628" watchObservedRunningTime="2026-03-18 16:52:09.926479682 +0000 UTC m=+472.946139951" Mar 18 16:52:12.304913 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:12.304874 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:12.305371 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:12.305117 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" podUID="18fc8710-f867-40a1-b662-f60e7caab690" containerName="seaweedfs-tls-custom" containerID="cri-o://7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78" gracePeriod=30 Mar 18 16:52:13.542792 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.542774 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:13.570186 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.570121 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgs9z\" (UniqueName: \"kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z\") pod \"18fc8710-f867-40a1-b662-f60e7caab690\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " Mar 18 16:52:13.570186 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.570173 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data\") pod \"18fc8710-f867-40a1-b662-f60e7caab690\" (UID: \"18fc8710-f867-40a1-b662-f60e7caab690\") " Mar 18 16:52:13.571448 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.571420 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data" (OuterVolumeSpecName: "data") pod "18fc8710-f867-40a1-b662-f60e7caab690" (UID: 
"18fc8710-f867-40a1-b662-f60e7caab690"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:52:13.572190 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.572163 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z" (OuterVolumeSpecName: "kube-api-access-dgs9z") pod "18fc8710-f867-40a1-b662-f60e7caab690" (UID: "18fc8710-f867-40a1-b662-f60e7caab690"). InnerVolumeSpecName "kube-api-access-dgs9z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:52:13.671054 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.671029 2571 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18fc8710-f867-40a1-b662-f60e7caab690-data\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:52:13.671054 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.671053 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dgs9z\" (UniqueName: \"kubernetes.io/projected/18fc8710-f867-40a1-b662-f60e7caab690-kube-api-access-dgs9z\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:52:13.922355 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.922276 2571 generic.go:358] "Generic (PLEG): container finished" podID="18fc8710-f867-40a1-b662-f60e7caab690" containerID="7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78" exitCode=0 Mar 18 16:52:13.922355 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.922334 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" Mar 18 16:52:13.922521 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.922357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" event={"ID":"18fc8710-f867-40a1-b662-f60e7caab690","Type":"ContainerDied","Data":"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78"} Mar 18 16:52:13.922521 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.922391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz" event={"ID":"18fc8710-f867-40a1-b662-f60e7caab690","Type":"ContainerDied","Data":"6bd40d09ea78ce2f6e58fac2e7dbbd46d416b23507bf48246f0950cf4ea3e235"} Mar 18 16:52:13.922521 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.922406 2571 scope.go:117] "RemoveContainer" containerID="7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78" Mar 18 16:52:13.930970 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.930954 2571 scope.go:117] "RemoveContainer" containerID="7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78" Mar 18 16:52:13.931194 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:52:13.931178 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78\": container with ID starting with 7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78 not found: ID does not exist" containerID="7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78" Mar 18 16:52:13.931259 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.931202 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78"} err="failed to get container status \"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78\": rpc error: code = 
NotFound desc = could not find container \"7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78\": container with ID starting with 7aa0e4b20757f1661221c8242b4676a8bb5bccc4f6c6a4626ca18a0001aa5d78 not found: ID does not exist" Mar 18 16:52:13.936564 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.936543 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:13.939684 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:13.939666 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-q9qhz"] Mar 18 16:52:15.634320 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:52:15.634288 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18fc8710-f867-40a1-b662-f60e7caab690" path="/var/lib/kubelet/pods/18fc8710-f867-40a1-b662-f60e7caab690/volumes" Mar 18 16:54:17.512389 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:54:17.512362 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:54:17.512875 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:54:17.512725 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:56:03.465750 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.465666 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:56:03.468064 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.466081 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18fc8710-f867-40a1-b662-f60e7caab690" containerName="seaweedfs-tls-custom" Mar 18 16:56:03.468064 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.466099 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="18fc8710-f867-40a1-b662-f60e7caab690" containerName="seaweedfs-tls-custom" Mar 18 16:56:03.468064 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.466171 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="18fc8710-f867-40a1-b662-f60e7caab690" containerName="seaweedfs-tls-custom" Mar 18 16:56:03.468976 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.468955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:03.471048 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.471024 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9lwwn\"" Mar 18 16:56:03.471173 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.471157 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:56:03.471545 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.471531 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Mar 18 16:56:03.477270 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.477247 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:56:03.650360 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.650323 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5nz\" (UniqueName: \"kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz\") pod \"message-dumper-predictor-558b4487d6-c5kwm\" (UID: \"0445a333-b8a2-4be9-8b87-6928dec241f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:03.751750 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.751661 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6j5nz\" (UniqueName: \"kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz\") pod \"message-dumper-predictor-558b4487d6-c5kwm\" (UID: \"0445a333-b8a2-4be9-8b87-6928dec241f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:03.759886 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.759857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5nz\" (UniqueName: \"kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz\") pod \"message-dumper-predictor-558b4487d6-c5kwm\" (UID: \"0445a333-b8a2-4be9-8b87-6928dec241f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:03.778837 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.778781 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:03.897646 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:03.897612 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:56:03.901458 ip-10-0-137-219 kubenswrapper[2571]: W0318 16:56:03.901422 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0445a333_b8a2_4be9_8b87_6928dec241f8.slice/crio-aa69ce3391441f665e5cd6ac35b4806a553e05ba59085a1ab2e583c6ddec1d2e WatchSource:0}: Error finding container aa69ce3391441f665e5cd6ac35b4806a553e05ba59085a1ab2e583c6ddec1d2e: Status 404 returned error can't find the container with id aa69ce3391441f665e5cd6ac35b4806a553e05ba59085a1ab2e583c6ddec1d2e Mar 18 16:56:04.529901 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:04.529865 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" 
event={"ID":"0445a333-b8a2-4be9-8b87-6928dec241f8","Type":"ContainerStarted","Data":"aa69ce3391441f665e5cd6ac35b4806a553e05ba59085a1ab2e583c6ddec1d2e"} Mar 18 16:56:05.534593 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:05.534559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" event={"ID":"0445a333-b8a2-4be9-8b87-6928dec241f8","Type":"ContainerStarted","Data":"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17"} Mar 18 16:56:05.534999 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:05.534798 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:05.536587 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:05.536567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:56:05.553700 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:56:05.553649 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" podStartSLOduration=1.555200271 podStartE2EDuration="2.553635193s" podCreationTimestamp="2026-03-18 16:56:03 +0000 UTC" firstStartedPulling="2026-03-18 16:56:03.903208124 +0000 UTC m=+706.922868370" lastFinishedPulling="2026-03-18 16:56:04.901643044 +0000 UTC m=+707.921303292" observedRunningTime="2026-03-18 16:56:05.552910679 +0000 UTC m=+708.572570958" watchObservedRunningTime="2026-03-18 16:56:05.553635193 +0000 UTC m=+708.573295459" Mar 18 16:57:58.269414 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:58.269385 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-558b4487d6-c5kwm_0445a333-b8a2-4be9-8b87-6928dec241f8/kserve-container/0.log" Mar 18 16:57:58.868698 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:58.868665 2571 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:57:58.868988 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:58.868942 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" podUID="0445a333-b8a2-4be9-8b87-6928dec241f8" containerName="kserve-container" containerID="cri-o://a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17" gracePeriod=30 Mar 18 16:57:59.100883 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.100862 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:57:59.270334 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.270232 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5nz\" (UniqueName: \"kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz\") pod \"0445a333-b8a2-4be9-8b87-6928dec241f8\" (UID: \"0445a333-b8a2-4be9-8b87-6928dec241f8\") " Mar 18 16:57:59.272388 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.272354 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz" (OuterVolumeSpecName: "kube-api-access-6j5nz") pod "0445a333-b8a2-4be9-8b87-6928dec241f8" (UID: "0445a333-b8a2-4be9-8b87-6928dec241f8"). InnerVolumeSpecName "kube-api-access-6j5nz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:57:59.371365 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.371329 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6j5nz\" (UniqueName: \"kubernetes.io/projected/0445a333-b8a2-4be9-8b87-6928dec241f8-kube-api-access-6j5nz\") on node \"ip-10-0-137-219.ec2.internal\" DevicePath \"\"" Mar 18 16:57:59.834549 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.834507 2571 generic.go:358] "Generic (PLEG): container finished" podID="0445a333-b8a2-4be9-8b87-6928dec241f8" containerID="a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17" exitCode=2 Mar 18 16:57:59.834549 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.834552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" event={"ID":"0445a333-b8a2-4be9-8b87-6928dec241f8","Type":"ContainerDied","Data":"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17"} Mar 18 16:57:59.834765 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.834566 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" Mar 18 16:57:59.834765 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.834578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm" event={"ID":"0445a333-b8a2-4be9-8b87-6928dec241f8","Type":"ContainerDied","Data":"aa69ce3391441f665e5cd6ac35b4806a553e05ba59085a1ab2e583c6ddec1d2e"} Mar 18 16:57:59.834765 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.834596 2571 scope.go:117] "RemoveContainer" containerID="a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17" Mar 18 16:57:59.842286 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.842266 2571 scope.go:117] "RemoveContainer" containerID="a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17" Mar 18 16:57:59.842542 ip-10-0-137-219 kubenswrapper[2571]: E0318 16:57:59.842524 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17\": container with ID starting with a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17 not found: ID does not exist" containerID="a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17" Mar 18 16:57:59.842589 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.842548 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17"} err="failed to get container status \"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17\": rpc error: code = NotFound desc = could not find container \"a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17\": container with ID starting with a3f178ffb8202a9b0968c140f924f48c6dad25df0ece404b3d0058f46275df17 not found: ID does not exist" Mar 18 16:57:59.847739 ip-10-0-137-219 kubenswrapper[2571]: 
I0318 16:57:59.847710 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:57:59.849149 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:57:59.849128 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-558b4487d6-c5kwm"] Mar 18 16:58:01.634061 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:58:01.634023 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0445a333-b8a2-4be9-8b87-6928dec241f8" path="/var/lib/kubelet/pods/0445a333-b8a2-4be9-8b87-6928dec241f8/volumes" Mar 18 16:59:17.530742 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:59:17.530718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 16:59:17.531522 ip-10-0-137-219 kubenswrapper[2571]: I0318 16:59:17.531502 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:04:17.546964 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:04:17.546930 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:04:17.548279 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:04:17.548255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:09:17.563030 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:09:17.563000 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:09:17.565571 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:09:17.565552 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:14:17.579778 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:14:17.579750 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:14:17.585000 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:14:17.584974 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:19:17.598951 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:19:17.598920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:19:17.602216 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:19:17.602196 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:24:17.618880 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:24:17.618852 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:24:17.622502 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:24:17.622483 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:29:17.636531 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:29:17.636501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:29:17.640432 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:29:17.640409 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:34:17.652706 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:34:17.652678 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:34:17.656496 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:34:17.656479 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:39:17.671301 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:39:17.671274 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:39:17.674324 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:39:17.674303 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:44:17.687360 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:44:17.687334 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:44:17.690884 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:44:17.690862 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log" Mar 18 17:46:38.951600 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.951570 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxxd9/must-gather-bzrnn"] Mar 18 17:46:38.952134 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.951807 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0445a333-b8a2-4be9-8b87-6928dec241f8" containerName="kserve-container" Mar 18 17:46:38.952134 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.951832 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0445a333-b8a2-4be9-8b87-6928dec241f8" containerName="kserve-container" Mar 18 17:46:38.952134 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.951872 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0445a333-b8a2-4be9-8b87-6928dec241f8" containerName="kserve-container" Mar 18 17:46:38.954627 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.954613 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:38.956439 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.956413 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"openshift-service-ca.crt\"" Mar 18 17:46:38.956567 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.956554 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"kube-root-ca.crt\"" Mar 18 17:46:38.956637 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.956607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vxxd9\"/\"default-dockercfg-w25nj\"" Mar 18 17:46:38.963054 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:38.963033 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/must-gather-bzrnn"] Mar 18 17:46:39.108880 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.108854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/732fc7b7-286b-451d-9298-6fa09463c374-must-gather-output\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " 
pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.109012 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.108892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zcl\" (UniqueName: \"kubernetes.io/projected/732fc7b7-286b-451d-9298-6fa09463c374-kube-api-access-t5zcl\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.209933 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.209857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/732fc7b7-286b-451d-9298-6fa09463c374-must-gather-output\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.209933 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.209899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zcl\" (UniqueName: \"kubernetes.io/projected/732fc7b7-286b-451d-9298-6fa09463c374-kube-api-access-t5zcl\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.210202 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.210181 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/732fc7b7-286b-451d-9298-6fa09463c374-must-gather-output\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.218145 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.218118 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zcl\" (UniqueName: 
\"kubernetes.io/projected/732fc7b7-286b-451d-9298-6fa09463c374-kube-api-access-t5zcl\") pod \"must-gather-bzrnn\" (UID: \"732fc7b7-286b-451d-9298-6fa09463c374\") " pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.263518 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.263485 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" Mar 18 17:46:39.378114 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.378074 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/must-gather-bzrnn"] Mar 18 17:46:39.381134 ip-10-0-137-219 kubenswrapper[2571]: W0318 17:46:39.381109 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod732fc7b7_286b_451d_9298_6fa09463c374.slice/crio-96b39638bad6a6e41735c63b70242312ab6a71e196607976aaff3b93d672110c WatchSource:0}: Error finding container 96b39638bad6a6e41735c63b70242312ab6a71e196607976aaff3b93d672110c: Status 404 returned error can't find the container with id 96b39638bad6a6e41735c63b70242312ab6a71e196607976aaff3b93d672110c Mar 18 17:46:39.382646 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:39.382630 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:46:40.255812 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:40.255779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" event={"ID":"732fc7b7-286b-451d-9298-6fa09463c374","Type":"ContainerStarted","Data":"96b39638bad6a6e41735c63b70242312ab6a71e196607976aaff3b93d672110c"} Mar 18 17:46:41.264791 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:41.264749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" 
event={"ID":"732fc7b7-286b-451d-9298-6fa09463c374","Type":"ContainerStarted","Data":"ba9e1cddeb56d58225a980268e53d3079c394e1e78989aa6ab40cfd27c0bed2b"} Mar 18 17:46:41.265258 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:41.264799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" event={"ID":"732fc7b7-286b-451d-9298-6fa09463c374","Type":"ContainerStarted","Data":"79a6c4f2a29934c76a9e4bb4c738431bebff799da006298842a1951067e8e088"} Mar 18 17:46:41.283100 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:41.283014 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxxd9/must-gather-bzrnn" podStartSLOduration=1.995228981 podStartE2EDuration="3.282994715s" podCreationTimestamp="2026-03-18 17:46:38 +0000 UTC" firstStartedPulling="2026-03-18 17:46:39.382754629 +0000 UTC m=+3742.402414875" lastFinishedPulling="2026-03-18 17:46:40.670520364 +0000 UTC m=+3743.690180609" observedRunningTime="2026-03-18 17:46:41.282490599 +0000 UTC m=+3744.302150891" watchObservedRunningTime="2026-03-18 17:46:41.282994715 +0000 UTC m=+3744.302654983" Mar 18 17:46:42.318760 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:42.318727 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l26p8_263447b9-97dc-43ab-bd1c-1fc05a93e62a/global-pull-secret-syncer/0.log" Mar 18 17:46:42.384086 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:42.384040 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4vbk6_c30f7041-e680-4daa-8027-9cc3928f1d8e/konnectivity-agent/0.log" Mar 18 17:46:42.509084 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:42.509051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-219.ec2.internal_4e24708ab1b7fb0b335c2f977303c6cf/haproxy/0.log" Mar 18 17:46:46.341463 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:46.341432 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/node-exporter/0.log" Mar 18 17:46:46.362314 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:46.362283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/kube-rbac-proxy/0.log" Mar 18 17:46:46.383410 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:46.383386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w2hjw_b35dc19b-c62a-4812-aa2d-2bf6fe09a5b0/init-textfile/0.log" Mar 18 17:46:49.192668 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.192628 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"] Mar 18 17:46:49.197095 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.197067 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx" Mar 18 17:46:49.205714 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.205692 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"] Mar 18 17:46:49.291812 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.291785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t592\" (UniqueName: \"kubernetes.io/projected/2c2368b4-058e-4b74-87bb-a1394e3479ed-kube-api-access-2t592\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx" Mar 18 17:46:49.292006 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.291840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-sys\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.292006 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.291888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-podres\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.292006 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.291935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-proc\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.292006 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.291989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-lib-modules\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.392800 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.392767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t592\" (UniqueName: \"kubernetes.io/projected/2c2368b4-058e-4b74-87bb-a1394e3479ed-kube-api-access-2t592\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393162 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-sys\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393304 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-sys\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393410 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-podres\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393410 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-podres\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393410 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-proc\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393531 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-proc\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393531 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-lib-modules\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.393531 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.393528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c2368b4-058e-4b74-87bb-a1394e3479ed-lib-modules\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.403152 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.403113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t592\" (UniqueName: \"kubernetes.io/projected/2c2368b4-058e-4b74-87bb-a1394e3479ed-kube-api-access-2t592\") pod \"perf-node-gather-daemonset-djshx\" (UID: \"2c2368b4-058e-4b74-87bb-a1394e3479ed\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.513383 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.513300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:49.652721 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:49.652122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"]
Mar 18 17:46:50.294472 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.294444 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx" event={"ID":"2c2368b4-058e-4b74-87bb-a1394e3479ed","Type":"ContainerStarted","Data":"382fcc5a7180ddb5eb19e8efedccdd81c7b17640ba35aa77bcb19e5a79e6f496"}
Mar 18 17:46:50.294472 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.294478 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx" event={"ID":"2c2368b4-058e-4b74-87bb-a1394e3479ed","Type":"ContainerStarted","Data":"c8dfc95b779e490cfe9dc62ec79f0d40a7194ae179a518f8d2617b7491c6a26a"}
Mar 18 17:46:50.294911 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.294608 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:46:50.314008 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.313960 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx" podStartSLOduration=1.313943466 podStartE2EDuration="1.313943466s" podCreationTimestamp="2026-03-18 17:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:46:50.312449893 +0000 UTC m=+3753.332110159" watchObservedRunningTime="2026-03-18 17:46:50.313943466 +0000 UTC m=+3753.333603732"
Mar 18 17:46:50.339987 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.339950 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-png9k_19740704-29b3-447d-9452-ddda50ca838f/dns/0.log"
Mar 18 17:46:50.364914 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.364894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-png9k_19740704-29b3-447d-9452-ddda50ca838f/kube-rbac-proxy/0.log"
Mar 18 17:46:50.497103 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:50.497077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zpjht_a6ec8ca6-7754-4403-be16-a059d6c3184b/dns-node-resolver/0.log"
Mar 18 17:46:51.048250 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:51.048224 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sv4zc_8ea30226-c756-46d3-b584-3613c9c03311/node-ca/0.log"
Mar 18 17:46:52.234231 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:52.234201 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8dnhc_55064175-dfab-4504-98ca-526cf33504e7/serve-healthcheck-canary/0.log"
Mar 18 17:46:52.720153 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:52.720107 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqvd4_df81206e-e9a8-44a0-a891-d6f01e5e71d8/kube-rbac-proxy/0.log"
Mar 18 17:46:52.745374 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:52.745350 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqvd4_df81206e-e9a8-44a0-a891-d6f01e5e71d8/exporter/0.log"
Mar 18 17:46:52.769520 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:52.769496 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqvd4_df81206e-e9a8-44a0-a891-d6f01e5e71d8/extractor/0.log"
Mar 18 17:46:54.858988 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:54.858955 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-69d7c9bbdc-wnx8j_6c069101-5600-4887-a294-751b45c6512d/manager/0.log"
Mar 18 17:46:54.880140 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:54.880112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-fk8zt_a9b61867-7be1-44e9-8de6-b40798d1055d/manager/0.log"
Mar 18 17:46:55.273966 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:55.273894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-748c497bc-5kn6w_85251014-67ea-4213-b89c-89a3afacfed5/seaweedfs/0.log"
Mar 18 17:46:56.307487 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:46:56.307461 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-djshx"
Mar 18 17:47:00.948933 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:00.948909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/kube-multus-additional-cni-plugins/0.log"
Mar 18 17:47:00.977954 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:00.977872 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/egress-router-binary-copy/0.log"
Mar 18 17:47:01.001904 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.001877 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/cni-plugins/0.log"
Mar 18 17:47:01.029636 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.029607 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/bond-cni-plugin/0.log"
Mar 18 17:47:01.054366 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.054340 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/routeoverride-cni/0.log"
Mar 18 17:47:01.080959 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.080931 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/whereabouts-cni-bincopy/0.log"
Mar 18 17:47:01.104688 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.104655 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5rc5k_943aaf2c-0467-48a6-ac3c-29da845f8444/whereabouts-cni/0.log"
Mar 18 17:47:01.656475 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.656449 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xn75z_c15c5fb8-0b4e-452f-8ef8-120891ac0340/kube-multus/0.log"
Mar 18 17:47:01.752431 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.752393 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqv2m_268f643d-a996-4e11-b14c-70264a1426da/network-metrics-daemon/0.log"
Mar 18 17:47:01.800585 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:01.800559 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqv2m_268f643d-a996-4e11-b14c-70264a1426da/kube-rbac-proxy/0.log"
Mar 18 17:47:03.473839 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.473797 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-controller/0.log"
Mar 18 17:47:03.497374 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.497344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/0.log"
Mar 18 17:47:03.514179 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.514144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovn-acl-logging/1.log"
Mar 18 17:47:03.537381 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.537344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/kube-rbac-proxy-node/0.log"
Mar 18 17:47:03.559106 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.559076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 17:47:03.579884 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.579852 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/northd/0.log"
Mar 18 17:47:03.602044 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.602018 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/nbdb/0.log"
Mar 18 17:47:03.625741 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.625708 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/sbdb/0.log"
Mar 18 17:47:03.740352 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:03.740278 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxnmv_d16fe930-746d-4ac7-b7e6-09d6a8b75d28/ovnkube-controller/0.log"
Mar 18 17:47:04.846561 ip-10-0-137-219 kubenswrapper[2571]: I0318 17:47:04.846529 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n98cl_5b14876c-8cf1-4df9-a7ba-bc940ec2ca85/network-check-target-container/0.log"