From: https://towardsdatascience.com/the-fascinating-no-gradient-approach-to-neural-net-optimization-abb287f88c97 Gradient descent is one of the most important ideas in machine learning: given some cost function to minimize, the algorithm iteratively takes steps along the direction of greatest downward slope, theoretically landing in a minimum after a sufficient number of iterations. First discovered by Cauchy in 1847 but expanded upon by Haskell Curry for non-linear … Continue reading The Fascinating No-Gradient Approach to Neural Net Optimization
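As a quick illustration of the idea (a minimal sketch, not taken from the article), a one-variable gradient descent loop fits in a few lines of Python; the cost function, starting point, and learning rate below are arbitrary choices for demonstration:

```python
# Minimal gradient descent sketch (illustrative only, not from the article).
# Minimizes f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0            # arbitrary starting point
lr = 0.1           # learning rate (step size), chosen for illustration
for _ in range(100):
    x -= lr * grad(x)   # step in the direction of steepest descent

print(round(x, 4))  # approaches the minimum at x = 3
```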
Category: Optimization
How to Develop Optimization Models in Python
From: https://towardsdatascience.com/how-to-develop-optimization-models-in-python-1a03ef72f5b4 Determining how to design and operate a system in the best way under given circumstances, such as the allocation of scarce resources, usually requires leveraging quantitative methods in decision making. Mathematical optimization is one of the main approaches for deciding the best action for a given situation. It consists of maximizing or minimizing the … Continue reading How to Develop Optimization Models in Python
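Purely as a generic illustration of such a model (the article's own tooling and numbers may differ), a tiny resource-allocation linear program can be solved with SciPy's linprog:

```python
# Hypothetical resource-allocation LP (illustrative numbers, not from the article).
# Maximize profit 20*x + 30*y subject to two resource constraints.
from scipy.optimize import linprog

c = [-20, -30]                 # linprog minimizes, so negate to maximize
A_ub = [[1, 2],                # resource 1: x + 2y <= 100
        [3, 1]]                # resource 2: 3x + y <= 90
b_ub = [100, 90]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)         # optimal (x, y) and the maximized profit
```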
Performance Optimization in R: Parallel Computing and Rcpp
From: https://tutorial.guidotti.dev/pa78y/ The ‘parallel’ package. Reference: https://bookdown.org/rdpeng/rprogdatascience/parallel-computation.html Many computations in R can be made faster by the use of parallel computation. Generally, parallel computation is the simultaneous execution of different pieces of a larger computation across multiple computing processors or cores. The parallel package can be used to send tasks (encoded as function calls) to each of the … Continue reading Performance Optimization in R: Parallel Computing and Rcpp
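The same pattern of farming independent function calls out to worker processes can be sketched with Python's multiprocessing module; this is only a cross-language analogue for illustration, not the R parallel package the tutorial actually covers:

```python
# Cross-language analogue of R's parLapply/mclapply, using Python's multiprocessing.
# Each element of the input list is handled by a separate worker process.
from multiprocessing import Pool

def slow_square(x):
    return x * x                 # stand-in for an expensive computation

if __name__ == "__main__":
    with Pool(processes=4) as pool:          # 4 workers, like a 4-core cluster
        results = pool.map(slow_square, range(10))
    print(results)
```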
Using Optuna to Optimize PyTorch Lightning Hyperparameters
From: https://medium.com/optuna/using-optuna-to-optimize-pytorch-lightning-hyperparameters-d9e04a481585 This post uses pytorch-lightning v0.6.0 (PyTorch v1.3.1) and optuna v1.1.0. PyTorch Lightning + Optuna! Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. PyTorch Lightning provides a lightweight PyTorch wrapper for better scaling with less code. Combining the two allows for automatic tuning of hyperparameters to find the … Continue reading Using Optuna to Optimize PyTorch Lightning Hyperparameters
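The core Optuna pattern, an objective function that samples hyperparameters from a trial and returns a score to minimize, looks roughly like the sketch below. The Optuna calls are standard API for the version the post cites; train_and_validate is a hypothetical placeholder for the PyTorch Lightning training loop the post describes:

```python
# Sketch of the Optuna study/objective pattern (standard Optuna API calls;
# train_and_validate is a hypothetical stand-in for the Lightning training loop).
import optuna

def train_and_validate(lr, n_layers):
    # Placeholder: in the post this would build and fit a LightningModule and
    # return its validation loss. Here it is a dummy score for illustration.
    return (lr - 0.01) ** 2 + 0.001 * n_layers

def objective(trial):
    lr = trial.suggest_loguniform("lr", 1e-5, 1e-1)   # sample a learning rate
    n_layers = trial.suggest_int("n_layers", 1, 3)    # sample a network depth
    return train_and_validate(lr, n_layers)           # value Optuna minimizes

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```

In newer Optuna releases, suggest_loguniform has an equivalent in suggest_float(..., log=True), but the form above matches the v1.1.0 API referenced in the post.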



