Month: March 2019

Random Forest Classifier Made Simple

This tutorial introduces you to an exciting machine learning technique: ensemble learning. Here’s my quick and dirty tip if your prediction accuracy sucks but you need to meet the deadline at all costs: try this “meta-learning” approach that combines the predictions (or classifications) of multiple machine learning algorithms. In many cases, it will give you …
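To make the idea concrete, here is a minimal sketch of such an ensemble classifier, assuming scikit-learn and its bundled Iris data set (the article’s own one-liner may differ):

```python
# Minimal sketch: a random forest combines the votes of many decision trees.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# 100 decision trees, each trained on a random subset of rows and features;
# their majority vote becomes the final prediction.
forest = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)
print(forest.predict(X[:3]))   # predicted class labels for the first three flowers
```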


[One-Liner Tutorial] Support Vector Machines Made Simple

Support Vector Machines (SVMs) have gained huge popularity in recent years. The reason is their robust classification performance, even in high-dimensional spaces: surprisingly, SVMs work even if there are more dimensions (features) than data items. This is unusual for classification algorithms because of the curse of dimensionality – with increasing dimensionality, the data becomes …
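As a rough illustration of that point, here is a minimal sketch using scikit-learn’s SVC on the bundled digits data set, which has 64 features per sample (not the article’s own example):

```python
# Minimal sketch: an SVM classifier on 64-dimensional digit images.
from sklearn.datasets import load_digits
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)   # 1797 samples, 64 features each

clf = SVC(kernel='rbf').fit(X, y)     # SVMs cope well with many features
print(clf.predict(X[:5]))             # predicted digit labels for the first five images
```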


[Tutorial] Neural Networks Made Easy — A Python One-Liner

Neural networks have gained massive popularity in recent years. This is not only a result of improved algorithms and learning techniques in the field but also of accelerated hardware performance and the rise of general-purpose GPU (GPGPU) technology. In this article, you’ll learn about the Multi-Layer Perceptron (MLP), which is one …
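For a taste of what the article builds up to, here is a minimal MLP sketch, assuming scikit-learn’s MLPClassifier and the bundled Iris data set (the article’s own one-liner may differ):

```python
# Minimal sketch: a Multi-Layer Perceptron with one hidden layer.
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# One hidden layer with 10 neurons; max_iter raised so training can converge.
mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1).fit(X, y)
print(mlp.predict(X[:3]))   # predicted class labels for the first three samples
```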


[Tutorial] How to Get the Row with Minimal Variance in One Line of Python Code

You may have read about the ‘V’s in Big Data: Volume, Velocity, Variety, Veracity, Value, Volatility. Variance is yet another important ‘V’ (it measures the Volatility of a data set). In practice, variance is an important measure with application domains in financial services, weather forecasting, and image processing. Variance measures how much the data spreads …
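Here is a minimal sketch of such a one-liner, assuming NumPy and a small made-up data array (the article’s own one-liner may look slightly different):

```python
# Minimal sketch: pick the row with minimal variance from a 2D array.
import numpy as np

X = np.array([[1, 2, 3],
              [4, 4, 4],
              [9, 1, 5]])

# np.var(X, axis=1) returns one variance per row; argmin() selects the flattest row.
min_var_row = X[np.var(X, axis=1).argmin()]
print(min_var_row)   # [4 4 4] -- the row with zero variance
```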


K-Nearest Neighbors as a Python One-Liner

The popular K-Nearest Neighbors Algorithm is used for regression and classification in many applications such as recommender systems, image classification, and financial data forecasting. It is the basis of many advanced machine learning techniques (e.g. in information retrieval). There is no doubt that understanding KNN is an important building block of your proficient computer science …
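Here is a minimal sketch of the algorithm in action, assuming scikit-learn’s KNeighborsClassifier and the bundled Iris data set (not necessarily the one-liner from the article):

```python
# Minimal sketch: K-Nearest Neighbors classification with k=3.
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Each query point is labeled by a majority vote of its 3 nearest training neighbors.
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict(X[:3]))   # predicted class labels for the first three samples
```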


[NumPy] How to Calculate Basic Statistics Along an Axis? (avg, var, std)

This article explains how to calculate basic statistics (average, standard deviation, and variance) along an axis. We use the NumPy library for linear algebra computations. These three operations are very similar: if you understand one of them, you’ll understand all of them. Graphical Explanation: Here’s what you want to achieve: extracting basic statistics from …
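For orientation, here is a minimal sketch of the three calls on a small made-up array (the article’s own example data may differ):

```python
# Minimal sketch: average, variance, and standard deviation along an axis.
import numpy as np

X = np.array([[1, 2, 3],
              [4, 5, 6]])

print(np.average(X, axis=0))  # column-wise means: [2.5 3.5 4.5]
print(np.var(X, axis=0))      # column-wise variances: [2.25 2.25 2.25]
print(np.std(X, axis=1))      # row-wise standard deviations: [0.8165 0.8165] (approx.)
```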



Method Overriding vs Overloading in Python [+Video]

Method overloading: allowing different parameters for calling the same method. Method overriding: overwriting the functionality of a method defined in a parent class. Here is an example of method overloading: What’s the output of this code? The class Wizard defines an instance attribute ‘mana’ that represents the energy level of the respective Wizard instance. If …
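The article’s own Wizard puzzle is not reproduced in this excerpt, so here is a minimal sketch of both concepts with a hypothetical Wizard/DarkWizard hierarchy (note that Python emulates overloading via default arguments rather than true method overloading):

```python
class Wizard:
    def __init__(self, mana=100):
        self.mana = mana            # energy level of this Wizard instance

    # "Overloading" Python-style: one method accepts different call signatures
    # via a default argument.
    def cast(self, cost=10):
        self.mana -= cost
        return self.mana


class DarkWizard(Wizard):
    # Overriding: the child class replaces the parent's cast() behavior.
    def cast(self, cost=10):
        self.mana -= cost // 2      # this subclass pays only half the mana
        return self.mana


print(Wizard().cast())       # 90
print(DarkWizard().cast())   # 95
```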
