Fashion, Trees, and Convolutions: Part III - Convolutions

March 10, 2020

In this mini-series of posts, I will describe a hyper-parameter tuning experiment on the fashion-mnist dataset. I wanted to test out a guided and easy way to run hyper-parameter tuning. In Part II, I described setting up the end-to-end pipeline with a baseline and running hyper-parameter tuning with the hyperopt package. In this third and final chapter, I describe my target models: a convolutional neural network trained from scratch and a transfer learning model. ... Read more

Fashion, Trees, and Convolutions: Part II - Baseline

March 8, 2020

In this mini-series of posts, I will describe a hyper-parameter tuning experiment on the fashion-mnist dataset. In Part I, I described the workflow to create the data for my experiments. In this post, I describe creating the baseline and a guided hyper-parameter tuning method. For any modeling task, I always like to create a baseline model as a starting point. Typically, this will be a relatively basic model. ... Read more

Fashion, Trees, and Convolutions: Part I - Data Crunch

March 6, 2020

In this mini-series of posts, I will describe a hyper-parameter tuning experiment on the fashion-mnist dataset. Hyper-parameter tuning is a very important aspect of training models in machine learning, particularly with neural networks, where the architecture, optimizer, and data can each be subject to different parameters. When developing machine learning solutions, there is an iterative cycle that can be adopted to enable fast iteration, continuous targeted improvements, and testing of the solution, as with any other software systems problem. ... Read more

Distill: Clearing Research Debt

April 8, 2017

Recently, I was referred by a friend to a website called Distill. Their stated purpose is to reduce research debt: a phenomenon that occurs when a field has accumulated so much prior work, hidden behind complicated explanations that could be substantially simplified, that scientific progress slows because researchers must spend a great deal of energy just understanding previous work. ... Read more