Accelerating TSNE with GPUs: From hours to seconds

From: https://medium.com/rapids-ai/tsne-with-gpus-hours-to-seconds-9d9c17c941db [Figure 1: cuML TSNE on MNIST Fashion takes 3 seconds; Scikit-Learn takes 1 hour.] TSNE (t-Distributed Stochastic Neighbor Embedding) is a popular unsupervised dimensionality reduction algorithm with uses as varied as neurology, image similarity, and visualizing neural networks. Unfortunately, its biggest drawback has been the long processing times of most available implementations. RAPIDS now provides fast GPU-accelerated TSNE, … Continue reading Accelerating TSNE with GPUs: From hours to seconds
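The speedup comes from swapping implementations behind an essentially identical API. A minimal sketch of what that swap looks like, assuming a RAPIDS/cuML installation and some feature matrix X (the array below is a random stand-in, not the Fashion-MNIST data from the post):

```python
# Same fit_transform call, two backends: scikit-learn on CPU vs. RAPIDS cuML on GPU.
# (Sketch only; assumes cuML is installed and X is a random stand-in for real data.)
import numpy as np

X = np.random.rand(10_000, 784).astype(np.float32)  # placeholder for Fashion-MNIST pixels

from sklearn.manifold import TSNE as SklearnTSNE
embedding_cpu = SklearnTSNE(n_components=2).fit_transform(X)   # slow on large data

from cuml.manifold import TSNE as CumlTSNE
embedding_gpu = CumlTSNE(n_components=2).fit_transform(X)      # runs on the GPU
```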

Intuitively Understanding Convolutions for Deep Learning

From: https://towardsdatascience.com/intuitively-understanding-convolutions-for-deep-learning-1f6f42faee1 The advent of powerful and versatile deep learning frameworks in recent years has made implementing convolution layers in a deep learning model an extremely simple task, often achievable in a single line of code. However, understanding convolutions, especially for the first time, can often feel a bit unnerving, with terms … Continue reading Intuitively Understanding Convolutions for Deep Learning
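The "single line of code" the excerpt refers to is typically a call like Keras's Conv2D. A minimal sketch, assuming TensorFlow/Keras and arbitrary illustrative shapes:

```python
# A convolution layer really is one line in Keras; the rest is just a tiny model around it.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                              # e.g. 28x28 grayscale images
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),   # the one-line convolution layer
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```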

Solving Differential Equations and Kolmogorov Equations using Deep Learning

From: https://medium.com/nieuwsgierigheid/solving-differential-equations-and-kolmogorov-equations-using-deep-learning-c39aed011a10 This work seeks to make Beck et al. (2018) understandable and to give their solutions enough background to prepare the reader for Stochastic Differential Equations. What are Stochastic Differential Equations? You’ve seen Ordinary Differential Equations. They are not spooky! They look like so: [Equation 1: An Ordinary Differential Equation] What defines an ordinary differential equation is … Continue reading Solving Differential Equations and Kolmogorov Equations using Deep Learning
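Equation 1 is an image in the original post, so it does not survive the excerpt. As a stand-in, a generic first-order ODE of the kind the post starts from, next to the stochastic version it builds toward (standard textbook form, not necessarily the post's exact equation):

```latex
% A generic first-order ODE, and its stochastic counterpart driven by a Brownian motion W_t.
\frac{dx(t)}{dt} = f\bigl(t, x(t)\bigr), \quad x(0) = x_0
\qquad \text{versus} \qquad
dX_t = \mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t
```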

ML impossible: Train 1 billion samples in 5 minutes on your laptop using Vaex and Scikit-Learn

From: https://towardsdatascience.com/ml-impossible-train-a-1-billion-sample-model-in-20-minutes-with-vaex-and-scikit-learn-on-your-9e2968e6f385 “Data is the new oil.” Regardless of whether or not you agree with this statement, the race for gathering and exploiting data has been going on for a while now. In fact, one thing the tech giants of today have in common is their capacity to fully exploit the enormous quantity of data … Continue reading ML impossible: Train 1 billion samples in 5 minutes on your laptop using Vaex and Scikit-Learn
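The trick that makes a billion samples tractable on a laptop is out-of-core, incremental learning: the model only ever sees one chunk of data at a time. A minimal sketch of that idea with scikit-learn's partial_fit (the post streams its chunks from a memory-mapped Vaex DataFrame; the random chunks below are just a stand-in):

```python
# Incremental (out-of-core) learning: update the model chunk by chunk with partial_fit,
# so the full dataset never needs to fit in memory. (Sketch only; not the post's Vaex pipeline.)
import numpy as np
from sklearn.linear_model import SGDRegressor

model = SGDRegressor()
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])

for _ in range(100):                                 # pretend each iteration reads one chunk of a huge file
    X_chunk = rng.normal(size=(10_000, 5))
    y_chunk = X_chunk @ true_w + rng.normal(scale=0.1, size=10_000)
    model.partial_fit(X_chunk, y_chunk)              # in-place update on this chunk only
```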

Using TensorFlow 2.0 to Compose Music

From: https://www.datacamp.com/community/tutorials/using-tensorflow-to-compose-music This tutorial was developed around TensorFlow 2.0 in Python, along with the high-level Keras API, which plays an enhanced role in TensorFlow 2.0. For those who would like to learn more about TensorFlow 2.0, see Introduction to TensorFlow in Python on DataCamp. For an exhaustive review of the deep learning for music literature, see Briot, Hadjeres, … Continue reading Using TensorFlow 2.0 to Compose Music
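The tutorial's own model is not shown in the excerpt; as a rough sketch of the kind of TensorFlow 2.0 / Keras model commonly used for next-note prediction (the vocabulary size, sequence length, and layer sizes below are placeholders, not the tutorial's values):

```python
# Sketch of a next-note prediction model in TensorFlow 2.x / Keras.
# Assumes notes have already been encoded as integer ids in [0, n_notes).
import tensorflow as tf

n_notes = 128   # assumed note-vocabulary size
seq_len = 32    # assumed input sequence length

model = tf.keras.Sequential([
    tf.keras.Input(shape=(seq_len,)),
    tf.keras.layers.Embedding(n_notes, 64),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(n_notes, activation="softmax"),   # distribution over the next note
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```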

Access the free economic database DBnomics with R

From: https://macro.cepremap.fr/article/2019-10/rdbnomics-tutorial/ DBnomics: the world’s economic database. Explore all the economic data from different providers (national and international statistical institutes, central banks, etc.), for free, by following the link db.nomics.world. You can also retrieve all the economic data through the rdbnomics package. This blog post describes the different ways to do so. Fetch time series by ids: First, let’s assume that … Continue reading Access the free economic database DBnomics with R
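The post fetches series by id with the rdbnomics R package. Purely as an illustration of the same fetch-by-id idea, here is a sketch using DBnomics' Python client instead (the dbnomics package name and the example series id are assumptions, not taken from the post):

```python
# Fetch one series by its DBnomics id and look at the observations.
# (Illustrative only; the post itself uses the rdbnomics package in R.)
from dbnomics import fetch_series

df = fetch_series("AMECO/ZUTN/EA19.1.0.0.0.ZUTN")   # provider/dataset/series id
print(df[["period", "value"]].head())
```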

Implementation of 17 classification algorithms in R using car evaluation data

From: https://www.datascience-zing.com/blog/implemetation-of-17-classification-algorithms-in-r-using-car-ev This data is obtained from the UCI Machine Learning Repository. The purpose of the analysis is to evaluate the safety standard of the cars based on certain parameters and classify them. The detailed description of the dataset is provided below as given on the website. For the detailed code, visit my GitHub repository. 1. Title: Car Evaluation … Continue reading Implementation of 17 classification algorithms in R using car evaluation data
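The post fits its 17 classifiers in R; as a minimal Python sketch of the same setup for a single baseline model (the UCI file location, column names, and choice of classifier are assumptions for illustration, not the post's code):

```python
# Load the UCI Car Evaluation data and fit one baseline classifier.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
url = "https://archive.ics.uci.edu/ml/machine-learning-databases/car/car.data"  # assumed location
df = pd.read_csv(url, names=cols)

X = pd.get_dummies(df.drop(columns="class"))   # one-hot encode the categorical features
y = df["class"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```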

Gradient Based Optimizations: Jacobians, Jababians & Hessians

From: https://medium.com/swlh/gradient-based-optimizations-jacobians-jababians-hessians-b7cbe62d662d Taylor Series to Constrained Optimization to Linear Least Squares. Jacobian: Sometimes we need to find all of the partial derivatives of a function whose input and output are both vectors. The matrix containing all such partial derivatives is the Jacobian. [The “Given:”, “The Jacobian matrix J is given by:”, and “Example of Jacobian Matrix” steps appear as equation images in the original post.] Let’s say: The … Continue reading Gradient Based Optimizations: Jacobians, Jababians & Hessians
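The definitions themselves are images in the original post; the standard form of what the excerpt describes, for a function f: R^n → R^m (plus the Hessian from the title, for scalar-valued f), is:

```latex
% Jacobian of f: R^n -> R^m (all first-order partials) and,
% for a scalar-valued f, the Hessian (all second-order partials).
J = \frac{\partial \mathbf{f}}{\partial \mathbf{x}} =
\begin{pmatrix}
  \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\
  \vdots & \ddots & \vdots \\
  \frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n}
\end{pmatrix},
\qquad
H_{ij} = \frac{\partial^2 f}{\partial x_i\,\partial x_j}
```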

Variance, Attractors and Behavior of Chaotic Statistical Systems

From: https://www.datasciencecentral.com/profiles/blogs/chaos-attractors-in-machine-learning-systems We study the properties of a typical chaotic system to derive general insights that apply to a large class of unusual statistical distributions. The purpose is to create a unified theory of these systems. These systems can be deterministic or random, yet due to their gentle chaotic nature, they exhibit the same behavior … Continue reading Variance, Attractors and Behavior of Chaotic Statistical Systems
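The excerpt does not name the specific system the article studies. As a stand-in for "a typical chaotic system," the logistic map is a fully deterministic recursion whose orbit nonetheless has stable long-run statistics one can measure like a random sample (illustrative only, not necessarily the article's system):

```python
# The logistic map x_{n+1} = 4 x_n (1 - x_n): deterministic, yet its orbit has a
# stable empirical distribution (mean ~ 0.5, variance ~ 0.125 for this map).
import numpy as np

x = 0.3
orbit = []
for _ in range(100_000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)

orbit = np.array(orbit)
print("mean:", orbit.mean(), "variance:", orbit.var())
```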

Noise: It’s not always annoying

From: https://towardsdatascience.com/noise-its-not-always-annoying-1bd5f0f240f One of the first concepts you learn when you begin to study neural networks is the meaning of overfitting and underfitting. Sometimes, it is a challenge to train a model that generalizes well to your data, especially when you have a small dataset, because: when you train a neural network with small datasets, the network generally memorizes … Continue reading Noise: It’s not always annoying
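The excerpt stops before the article's point, but the title gives it away: deliberately injected noise can act as a regularizer when data is scarce. A minimal sketch of that idea, assuming Keras (the layer sizes and noise level are placeholders):

```python
# Injecting Gaussian noise into the inputs acts as a regularizer; the GaussianNoise
# layer is only active during training, so inference is unaffected.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.GaussianNoise(0.1),            # add noise to inputs while training
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```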