k-Nearest Neighbors algorithms

Luckily, there are solutions to each of the problems above: Locality-Sensitive Hashing addresses the expensive computation cost, distance metric learning [3] approximates the 'correct' distance metric, and random projection [4], along with other dimensionality reduction techniques, helps break the curse of dimensionality. These topics will be explored further in future blog posts.
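To give a flavor of the random projection idea, here is a minimal sketch in Python using NumPy. It applies the sparse "database-friendly" projection of Achlioptas [4] (entries +1, 0, -1 with probabilities 1/6, 2/3, 1/6) before a brute-force nearest-neighbor search; the data sizes and variable names are illustrative assumptions, not taken from any specific experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 500 points in a 1,000-dimensional space, plus one query
n, d, k = 500, 1000, 50  # k is the (assumed) reduced dimension
X = rng.normal(size=(n, d))
query = rng.normal(size=d)

# Achlioptas-style sparse projection matrix, scaled by sqrt(3 / k)
# so that distances are approximately preserved in expectation
R = rng.choice([1.0, 0.0, -1.0], size=(d, k), p=[1/6, 2/3, 1/6]) * np.sqrt(3.0 / k)

# Project the data and the query into the k-dimensional space
X_low = X @ R        # shape (n, k)
q_low = query @ R    # shape (k,)

# Brute-force 1-nearest-neighbor search in the reduced space:
# each distance now costs O(k) instead of O(d)
dists = np.linalg.norm(X_low - q_low, axis=1)
nearest = int(np.argmin(dists))
```

By the Johnson-Lindenstrauss lemma, pairwise distances in the projected space are distorted by at most a small factor with high probability, so the neighbor found in k dimensions is usually the same as (or close to) the true neighbor in the original d dimensions, at a fraction of the cost.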


References

[1] Cover, T. M. and Hart, P. E. (1967), Nearest neighbor pattern classification, IEEE Transactions on Information Theory, 13(1): 21-27.

[2] Beyer, K.; Goldstein, J.; Ramakrishnan, R.; Shaft, U. (1999), When is "Nearest Neighbor" Meaningful?, Proc. 7th International Conference on Database Theory (ICDT '99), LNCS 1540: 217-235.

[3] Yang and Liu (2006), Distance Metric Learning: A Comprehensive Survey.

[4] Achlioptas, D. (2003), Database-friendly random projections: Johnson-Lindenstrauss with binary coins, Journal of Computer and System Sciences, 66(4): 671-687.
