Journal
2017 IEEE INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA)
Volume -, Issue -, Pages 1-10
Publisher
IEEE
DOI: 10.1109/DSAA.2017.22
Keywords
Classification; Statistical Learning Theory; Rademacher Complexity; VC Dimension; Nearest Neighbor; Quantization; Regularization; Empirical Risk Minimization
Funding
- NSF award [IIS-1247581]
- DARPA/Army award [W911NF-16-1-0553]
- DARPA/AFRL award [FA8750-17-2-0102]
We define the k-Nearest Representatives (k-NR) classifier, a distance-based classifier similar to the k-nearest neighbors classifier, with comparable accuracy in practice and stronger generalization bounds. Uniform convergence is shown through Rademacher complexity, and generalizability is controlled through regularization. Finite-sample risk bounds are also given. Compared to the k-NN, the k-NR requires less memory to store, and classification queries may be answered more efficiently. Training is also efficient, being polynomial in all parameters, and is accomplished via a simple empirical risk minimization process.
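The k-NR decision rule can be sketched as follows: store a small set of labeled representatives per class, then classify a query by majority vote among its k nearest representatives. This is only a minimal illustration; the paper selects representatives via regularized empirical risk minimization, whereas the sketch below substitutes a plain random subsample per class, and all function names are illustrative.

```python
import numpy as np

def fit_representatives(X, y, m_per_class=3, seed=0):
    """Pick m representatives per class.

    Illustrative stand-in: a random subsample per class. The paper's
    actual training step chooses representatives by empirical risk
    minimization with regularization.
    """
    rng = np.random.default_rng(seed)
    reps, labels = [], []
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        chosen = rng.choice(idx, size=min(m_per_class, idx.size), replace=False)
        reps.append(X[chosen])
        labels.append(np.full(len(chosen), c))
    return np.vstack(reps), np.concatenate(labels)

def knr_predict(X_query, reps, rep_labels, k=1):
    """Classify each query by majority vote among its k nearest representatives."""
    # Pairwise Euclidean distances: (n_queries, n_representatives)
    d = np.linalg.norm(X_query[:, None, :] - reps[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    votes = rep_labels[nearest]
    return np.array([np.bincount(v).argmax() for v in votes])
```

Because only the representatives are stored, memory and query cost scale with the number of representatives rather than the full training set, which is the practical advantage over k-NN noted in the abstract.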
Authors