K-NN is one of the most beginner-friendly algorithms in machine learning. It can be used for both classification and regression problems. It predicts the label of a data point from the labels of its nearby neighbors, an approach known as instance-based (lazy) learning. For classification, we select the k nearest neighbors and take a majority vote, as sketched in the code below.
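
To make the voting idea concrete, here is a minimal sketch of K-NN classification in Python. This is illustrative only, not code from the original post: the function name knn_predict and the toy 2-D dataset are assumed, and the distance is taken to be Euclidean.

from collections import Counter
import math

def knn_predict(X_train, y_train, x_new, k=3):
    # Distance from the new point to every stored training example.
    distances = [(math.dist(x, x_new), label) for x, label in zip(X_train, y_train)]
    # Keep the k closest neighbors.
    neighbors = sorted(distances, key=lambda d: d[0])[:k]
    # Majority vote (the mode) over the neighbors' labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy example: points near (1, 1) are labeled "A", points near (6, 6) are "B".
X_train = [(1, 1), (1, 2), (2, 1), (6, 6), (7, 6), (6, 7)]
y_train = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(X_train, y_train, (2, 2), k=3))  # prints "A"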

A mathematical perspective on the definition of K-NN
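
The equation image from the original post does not survive here; based on the description that follows, it presumably shows the majority-vote rule, which can be written as

\hat{y} = \operatorname{mode}\bigl(\{\, y_i : x_i \in N_k(x) \,\}\bigr),

where N_k(x) denotes the k training points nearest to the query point x and the y_i are their labels.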

In the above equation, we can see that the predicted label y is the mode of the labels of its nearby neighbors. For example, if the neighbors' labels form the set {5, 5, 5, 3, 3, 2}, the mode of the set is 5, so the new data point is assigned the label 5.
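
As a quick check of that example, Python's standard library can compute the mode directly (a small illustrative snippet, not from the original post):

from statistics import mode

neighbor_labels = [5, 5, 5, 3, 3, 2]
print(mode(neighbor_labels))  # prints 5, so the new data point gets the label 5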

To perform…


Data Science

This is my first post on Medium. I would like to share my learnings on this platform to enhance the reader's knowledge and, at the same time, my own, because we only understand something better when we explain it to others! Here's my journey towards Data Science.

It's unbelievable that I'm now pursuing Data Science. I never expected that what I was trying to accomplish in my career would be related to Data Science.

Ever since I started studying Digital Signal Processing and Image Processing during my engineering degree, I could relate them to my passion for artistic skills like…

Sravani Subraveti

Hi! I graduated from the University of Waterloo with a Master's degree, an MEng in Electrical and Computer Engineering. Happiness is learning a new thing every day.
