
Cosine K-Nearest Neighbor in Milkfish Eye Classification


The K-Nearest Neighbors (K-NN) classification method has been refined repeatedly by researchers. These refinements aim to address sensitivity to noise when a small K is used and irrelevant classification results when a large K is used. The problem with the previous version of the method was that the weights were calculated for each neighbor individually, so the result was not optimal. We propose a new weighting scheme, called Cosine K-NN (CosKNN), in which the weights are no longer obtained from each nearest neighbor individually but by involving all pairs of nearest neighbors. We also introduce a trigonometric map to describe the cosine weight. CosKNN produces soft values that represent the membership of the testing data in each class. Empirically, CosKNN is tested and compared with other K-NN refinements on milkfish eye, UCI, and KEEL datasets. The results show that CosKNN achieves superior performance compared to the other methods even when K is large, with an accuracy of 96.79%.
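The abstract does not spell out the exact CosKNN weighting formula, so the sketch below only illustrates the general idea it alludes to: weighting the votes of the K nearest neighbors by cosine similarity and reporting soft class-membership scores. The function name `cosine_weighted_knn_predict` and the random data are hypothetical and are not taken from the paper.

```python
# Illustrative sketch of cosine-weighted K-NN voting with soft class scores.
# This is NOT the paper's CosKNN formulation, only a hedged approximation of
# the idea described in the abstract.
import numpy as np

def cosine_weighted_knn_predict(X_train, y_train, x_test, k=5):
    """Weight each of the k nearest neighbors of x_test by its cosine
    similarity to x_test, then return the predicted label together with
    normalized (soft) per-class membership scores."""
    # Euclidean distances from the test point to every training point.
    dists = np.linalg.norm(X_train - x_test, axis=1)
    nn_idx = np.argsort(dists)[:k]  # indices of the k nearest neighbors

    # Cosine similarity between the test point and each selected neighbor,
    # clipped at zero so the soft memberships stay non-negative.
    norms = np.linalg.norm(X_train[nn_idx], axis=1) * np.linalg.norm(x_test)
    cos_w = np.clip((X_train[nn_idx] @ x_test) / np.maximum(norms, 1e-12), 0.0, None)

    # Accumulate neighbor weights per class and normalize into soft values.
    classes = np.unique(y_train)
    scores = np.array([cos_w[y_train[nn_idx] == c].sum() for c in classes])
    soft = scores / np.maximum(scores.sum(), 1e-12)
    return classes[np.argmax(soft)], dict(zip(classes, soft))

# Purely illustrative usage with synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = (X[:, 0] > 0).astype(int)
label, membership = cosine_weighted_knn_predict(X, y, rng.normal(size=4), k=7)
```

In this sketch the soft scores play the role the abstract describes for CosKNN's class-ownership values; the paper's method additionally involves all pairs of nearest neighbors and a trigonometric map, details that are not recoverable from the abstract alone.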

Keywords: nearest neighbor; cosine; classification; milkfish eye

Journal Title: International Journal of Intelligent Engineering and Systems
Year Published: 2020
