Applying feature-weighted gradient decent k-nearest neighbor to select promising projects for scientific funding

Chuqing Zhang, Jiangyuan Yao*, Guangwu Hu, Thomas Schøtt

*Corresponding author for this work

Publication: Contribution to journal › Journal article › Research › peer-review


Abstract

Due to their outstanding ability to process large quantities of high-dimensional data, machine learning models have been applied in many areas, such as pattern recognition, classification, spam filtering, data mining, and forecasting. As an outstanding machine learning algorithm, K-Nearest Neighbor (KNN) has been widely used in different settings, yet its application to selecting qualified applicants for funding is almost new. The major problem lies in how to accurately determine the importance of attributes. In this paper, we propose a Feature-weighted Gradient Descent K-Nearest Neighbor (FGDKNN) method to classify funding applicants into two classes: approved and not approved. FGDKNN uses a gradient descent learning algorithm to update the feature weights: it iteratively adjusts the weights to minimize the error ratio, so that the importance of attributes is captured more accurately. We investigate the performance of FGDKNN on the Beijing Innofund dataset. The results show that FGDKNN performs about 23%, 20%, 18%, and 15% better than KNN, SVM, DT, and ANN, respectively. Moreover, FGDKNN converges quickly across different training scales and performs well under different settings.
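The abstract describes a KNN variant whose distance metric weights each attribute, with the weights learned by iteratively minimizing the classification error ratio. The paper's exact update rule is not reproduced here; the sketch below is an illustrative, hypothetical version of the idea, using a weighted Euclidean distance and a finite-difference approximation of the gradient of the leave-one-out error (the error ratio itself is not differentiable, so a true analytic gradient would require a smooth surrogate as in the paper).

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, w, k=3):
    """Classify x by majority vote among its k nearest training points
    under a feature-weighted Euclidean distance d(a, b) = sqrt(sum w_j (a_j - b_j)^2)."""
    d = np.sqrt(((X_train - x) ** 2 * w).sum(axis=1))
    nearest = np.argsort(d)[:k]
    return np.bincount(y_train[nearest]).argmax()

def learn_weights(X, y, k=3, lr=0.5, epochs=10, eps=0.1):
    """Hypothetical weight learning loop: estimate how the leave-one-out
    error ratio changes with each feature weight (finite differences),
    then take a gradient-descent step, keeping weights non-negative."""
    n, m = X.shape
    w = np.ones(m)

    def loo_error(w):
        # Leave-one-out error ratio under the current weights.
        errs = 0
        for i in range(n):
            mask = np.arange(n) != i
            pred = weighted_knn_predict(X[mask], y[mask], X[i], w, k)
            errs += int(pred != y[i])
        return errs / n

    for _ in range(epochs):
        base = loo_error(w)
        grad = np.zeros(m)
        for j in range(m):
            w_up = w.copy()
            w_up[j] += eps
            grad[j] = (loo_error(w_up) - base) / eps
        w = np.clip(w - lr * grad, 0.0, None)
    return w
```

A weight driven toward zero effectively removes that attribute from the distance, which is how this family of methods expresses attribute importance.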

Original language: English
Journal: Computers, Materials & Continua
Volume: 64
Issue number: 3
Pages (from-to): 1741-1753
ISSN: 1546-2218
DOI
Status: Published - 30 Jun 2020

