Using Optuna to Optimize PyTorch Lightning Hyperparameters

From: https://medium.com/optuna/using-optuna-to-optimize-pytorch-lightning-hyperparameters-d9e04a481585 This post uses pytorch-lightning v0.6.0 (PyTorch v1.3.1) and optuna v1.1.0. PyTorch Lightning + Optuna! Optuna is a hyperparameter optimization framework applicable to machine learning frameworks and black-box optimization solvers. PyTorch Lightning provides a lightweight PyTorch wrapper for better scaling with less code. Combining the two allows for automatic tuning of hyperparameters to find the … Continue reading Using Optuna to Optimize PyTorch Lightning Hyperparameters
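The linked post wires Optuna into a Lightning training loop; as a rough illustration of Optuna's define-by-run suggest/optimize pattern, here is a minimal sketch using plain PyTorch rather than the article's Lightning integration. The hyperparameter names, the synthetic dataset, and the tiny model are illustrative assumptions, not code from the post.

```python
# Minimal Optuna sketch (plain PyTorch, not the article's Lightning setup).
import optuna
import torch
import torch.nn as nn

def objective(trial):
    # Optuna samples hyperparameters for each trial via trial.suggest_* calls.
    lr = trial.suggest_loguniform("lr", 1e-4, 1e-1)
    hidden = trial.suggest_int("hidden", 8, 64)

    # Tiny synthetic regression problem: y = sum(x) + noise (illustrative only).
    x = torch.randn(256, 10)
    y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)

    model = nn.Sequential(nn.Linear(10, hidden), nn.ReLU(), nn.Linear(hidden, 1))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()

    for _ in range(50):  # a few quick optimization steps per trial
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

    return loss.item()  # Optuna minimizes the value returned by the objective

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```

In the article's setting, the body of `objective` would build and fit a LightningModule with a Trainer instead of the hand-written loop above, but the outer study/objective structure is the same.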

PyTorch layer dimensions: what size and why?

From: https://towardsdatascience.com/pytorch-layer-dimensions-what-sizes-should-they-be-and-why-4265a41e01fd Preface This article covers defining tensors, properly initializing neural network layers in PyTorch, and more. Introduction You might be asking: “How do I initialize my layer dimensions in PyTorch without getting yelled at?” Is it all just trial and error? No, really… What are they supposed to be? For starters, did you … Continue reading PyTorch layer dimensions: what size and why?
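As a quick companion to that article's topic, a small sketch of the shape conventions it discusses: nn.Conv2d expects (N, C, H, W) input, while nn.Linear expects (N, in_features), so convolutional output must be flattened and the Linear's in_features must match. The specific sizes below are assumed for illustration, not taken from the post.

```python
import torch
import torch.nn as nn

# Conv2d expects (N, C_in, H, W): here a batch of 4 RGB images, 32x32 pixels.
x = torch.randn(4, 3, 32, 32)

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
out = conv(x)
print(out.shape)  # torch.Size([4, 16, 32, 32]) -- padding=1 keeps the spatial size

# Linear expects (N, in_features), so flatten everything except the batch dim first.
flat = out.view(out.size(0), -1)      # shape (4, 16 * 32 * 32)
fc = nn.Linear(16 * 32 * 32, 10)      # in_features must equal the flattened size
print(fc(flat).shape)                 # torch.Size([4, 10])
```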

How Data Augmentation Improves your CNN performance? — An Experiment in PyTorch and Torchvision

From: https://medium.com/swlh/how-data-augmentation-improves-your-cnn-performance-an-experiment-in-pytorch-and-torchvision-e5fb36d038fb Simple ways to boost your network performance (Image credit: https://amanispas.co.za/wp-content/uploads/2019/12/AdobeStock_241822083-Web-Crop-1360x680.jpg) In simple terms, Data Augmentation is simply creating fake data: you use the data in the existing train set to create variations of it. This does two things: it increases the size of your training set and it regularizes your network. The book Deep Learning defines regularization as any method … Continue reading How Data Augmentation Improves your CNN performance? — An Experiment in PyTorch and Torchvision
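For a sense of what such augmentation looks like in torchvision, here is a generic transform pipeline sketch; the particular transforms and the CIFAR-10-style normalization statistics are assumptions for illustration and may differ from the experiment in the article.

```python
from PIL import Image
from torchvision import transforms

# Each epoch sees a slightly different random variant of every training image.
train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomCrop(32, padding=4),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.4914, 0.4822, 0.4465),   # CIFAR-10-style stats (assumed)
                         std=(0.2470, 0.2435, 0.2616)),
])

# Normally passed to a dataset, e.g. torchvision.datasets.CIFAR10(..., transform=train_transform).
img = Image.new("RGB", (32, 32))       # placeholder image just to show the call
print(train_transform(img).shape)      # torch.Size([3, 32, 32])
```

Because the transforms are applied on the fly inside the DataLoader, no augmented copies are ever written to disk; the effective training set grows while storage stays the same.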