Journal
INFORMATION SCIENCES
Volume 364, Issue -, Pages 129-145
Publisher: ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2015.09.021
Keywords
Random bases; Measure concentration; Neural networks; Approximation
Funding
- Innovate UK Technology Strategy Board (Knowledge Transfer Partnership grant) [KTP009890]
- Russian Foundation for Basic Research [15-38-20178]
Abstract
In this work we discuss the problem of selecting suitable approximators from families of parameterized elementary functions that are known to be dense in a Hilbert space of functions. We consider and analyze published procedures, both randomized and deterministic, for selecting elements from these families that have been shown to ensure a rate of convergence in the L2 norm of order O(1/N), where N is the number of elements. We show that both randomized and deterministic procedures are successful if additional information about the families of functions to be approximated is provided. In the absence of such additional information, one may observe exponential growth of the number of terms needed to approximate the function and/or extreme sensitivity of the outcome of the approximation to parameters. Implications of our analysis for applications of neural networks in modeling and control are illustrated with examples. (C) 2015 Elsevier Inc. All rights reserved.
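The randomized selection procedure discussed in the abstract can be illustrated with a minimal random-basis regression sketch: draw the inner parameters of N elementary functions at random, then fit only the outer (linear) weights by least squares and observe how the discrete L2 error behaves as N grows. The tanh basis, the uniform sampling ranges, and the target function below are assumptions for illustration, not the paper's specific construction.

```python
import numpy as np

# Illustrative sketch only: approximate a target function with N randomly
# drawn tanh basis elements. Inner parameters (weights, biases) are random
# (an assumption of this sketch); outer weights are fitted by least squares.
rng = np.random.default_rng(0)

x = np.linspace(0.0, 1.0, 500)
y = np.sin(2.0 * np.pi * x)          # target function to approximate (assumption)

def random_basis_fit(N):
    """Fit outer weights for N random tanh elements; return RMS (discrete L2) error."""
    w = rng.uniform(-10.0, 10.0, N)  # random inner weights (sampling range assumed)
    b = rng.uniform(-10.0, 10.0, N)  # random biases (sampling range assumed)
    Phi = np.tanh(np.outer(x, w) + b)            # 500 x N design matrix
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # optimal outer weights
    return float(np.sqrt(np.mean((y - Phi @ c) ** 2)))

errors = {N: random_basis_fit(N) for N in (5, 20, 80)}
for N, e in errors.items():
    print(f"N = {N:3d}  RMS error = {e:.4f}")
```

For a smooth, well-behaved target like this one, the error typically shrinks as N grows; the paper's point is that without additional information about the function family, such favorable behavior is not guaranteed and the required N can grow exponentially.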