

2016 | 3 (53) | 32-41

Article title

Propozycja agregowanego klasyfikatora kNN z selekcją zmiennych

Authors


Title variants

EN
The proposition of the kNN ensemble with feature selection

Languages of publication

PL

Abstracts

EN
Aggregated classification trees have gained recognition due to their improved stability and frequently reduced bias. However, adapting this approach to the k nearest neighbors method (kNN) faces some difficulties: the relatively high stability of these classifiers, and the increase in misclassifications when variables without discriminative power are present in the training set. In this paper we propose an aggregated kNN classifier with feature selection. Its classification accuracy has been verified on real data with added irrelevant variables.
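
The abstract outlines the general construction but not the exact feature-selection criterion, so the following Python sketch only illustrates the idea: a bagging ensemble (Breiman 1996) of kNN base learners in which each component model is trained on a bootstrap sample and its own feature subset, here drawn at random in the spirit of Bay (1999). The class name, the parameter defaults, and the random-subset step are assumptions for illustration, not the algorithm proposed in the article.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

class BaggedFeatureSubsetKNN:
    """Illustrative sketch (not the article's method): bagging-style ensemble
    of kNN classifiers, each trained on a bootstrap sample and a randomly
    chosen feature subset, combined by majority vote."""

    def __init__(self, n_estimators=25, subset_size=None, k=5, random_state=None):
        self.n_estimators = n_estimators
        self.subset_size = subset_size    # features per component model
        self.k = k
        self.random_state = random_state

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        rng = np.random.default_rng(self.random_state)
        n, p = X.shape
        # assumed default: sqrt(p) features per model, a common heuristic
        m = self.subset_size or max(1, int(np.sqrt(p)))
        self.classes_ = np.unique(y)
        self.models_, self.subsets_ = [], []
        for _ in range(self.n_estimators):
            rows = rng.integers(0, n, size=n)            # bootstrap sample
            cols = rng.choice(p, size=m, replace=False)  # random feature subset
            model = KNeighborsClassifier(n_neighbors=self.k)
            model.fit(X[np.ix_(rows, cols)], y[rows])
            self.models_.append(model)
            self.subsets_.append(cols)
        return self

    def predict(self, X):
        X = np.asarray(X)
        # component predictions: one row per model, one column per observation
        votes = np.stack([m.predict(X[:, cols])
                          for m, cols in zip(self.models_, self.subsets_)])
        # majority vote per observation (labels mapped to indices for bincount)
        idx = np.searchsorted(self.classes_, votes)
        winners = [np.bincount(idx[:, j], minlength=len(self.classes_)).argmax()
                   for j in range(idx.shape[1])]
        return self.classes_[winners]

Fitting such an ensemble on a training matrix with appended noise columns and comparing its test accuracy against a single kNN fitted on all features reproduces the kind of experiment the abstract describes: the per-model feature subsets both destabilize the otherwise stable kNN base learners and limit the influence of irrelevant variables.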

Contributors

author

References

  • Bay S.D., 1999, Nearest neighbour classification from multiple feature subsets, Intelligent Data Analysis, 3(3), pp. 191-209.
  • Breiman L., 1996, Bagging predictors, Machine Learning, 24(2), pp. 123-140.
  • Breiman L., 2001, Random forests, Machine Learning, 45, pp. 5-32.
  • Dietterich T.G., 2000, Ensemble methods in machine learning, [in:] Multiple Classifier Systems. First International Workshop, vol. 1857, Springer-Verlag.
  • Domeniconi C., Yan B., 2004, Nearest neighbour ensemble, [in:] Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), IEEE, vol. 1.
  • Enas G.G., Choi S.C., 1986, Choice of the smoothing parameter and efficiency of k-nearest neighbor classification, Computers & Mathematics with Applications, 12A(2), pp. 235-244.
  • Frank A., Asuncion A., 2010, UCI Machine Learning Repository, University of California, School of Information and Computer Science, Irvine, CA, http://archive.ics.uci.edu/ml/.
  • Freund Y., Schapire R.E., 1997, A decision-theoretic generalization of on-line learning and an application to boosting, Journal of Computer and System Sciences, 55, pp. 119-139.
  • Friedman J.H., Popescu B.E., 2005, Predictive learning via rule ensembles, Technical Report, Department of Statistics, Stanford University.
  • Fumera G., Roli F., 2001, Error rejection in linearly combined multiple classifiers, [in:] Proceedings of the International Workshop on Multiple Classifier Systems, Springer, Cambridge, UK.
  • Gatnar E., 2008, Podejście wielomodelowe w zagadnieniach dyskryminacji i regresji, Wydawnictwo Naukowe PWN, Warszawa.
  • Gul A., Perperoglou A., Khan Z., Mahmoud O., Miftahuddin M., Adler W., Lausen B., 2014, Ensemble of a subset of kNN classifiers, Advances in Data Analysis and Classification, pp. 1-14.
  • Guyon I., Gunn S., Nikravesh M., Zadeh L., 2006, Feature Extraction: Foundations and Applications, Springer, New York.
  • Hechenbichler K., Schliep K.P., 2004, Weighted k-Nearest-Neighbor Techniques and Ordinal Classification, Discussion Paper 399, SFB 386, Ludwig-Maximilians-Universität München.
  • Hellwig Z., 1969, Problem optymalnego wyboru predyktant, Przegląd Statystyczny, no. 3-4, pp. 221-237.
  • King R.D., Feng C., Sutherland A., 1995, StatLog: Comparison of classification algorithms on large real-world problems, Applied Artificial Intelligence, 9(3), pp. 289-333.
  • Kira K., Rendell L.A., 1992, The feature selection problem: Traditional methods and a new algorithm, [in:] Proceedings of AAAI-92, MIT Press.
  • Kononenko I., 1994, Estimating attributes: Analysis and extensions of RELIEF, [in:] Proceedings of the European Conference on Machine Learning.
  • Kubus M., 2015, Feature selection and the chessboard problem, Acta Universitatis Lodziensis, Folia Oeconomica, Statistical Analysis in Theory and Practice, no. 1(311), pp. 17-25.
  • Kubus M., 2016, Lokalna ocena mocy dyskryminacyjnej zmiennych, Prace Naukowe Uniwersytetu Ekonomicznego we Wrocławiu, no. 427, Taksonomia 27, Klasyfikacja i analiza danych – teoria i zastosowania, Wrocław.
  • Opitz D., Maclin R., 1999, Popular ensemble methods: An empirical study, Journal of Artificial Intelligence Research, 11, pp. 169-198.
  • Zhou Z.H., Yu Y., 2005, Adapt bagging to nearest neighbour classifiers, Journal of Computer Science and Technology, 20(1), pp. 48-54.

Document Type

Publication order reference

Identifiers

YADDA identifier

bwmeta1.element.desklight-d3371341-b17e-4429-89b8-314c2685adfb