Commonly, an algorithm needs a certain number of variables that control its behavior. Optimal values for these variables yield better performance, which can generate profits for companies, make the algorithm stand out from similar applications, or improve its ranking in algorithm competitions. However, finding these values is not straightforward, because manual tuning can be a stressful and difficult task even for expert users. This paper presents, evaluates, and compares four hyper-parameter optimization tools from the literature, selected for their number of citations, code availability, and impact: MCMC, SMAC, TPE, and Spearmint. We evaluate these tools, using the publicly available source code provided by their authors, in a computer vision application: multiple object tracking (MOT), a challenging topic whose strategies rely on a set of tunable parameters. The analysis considers the impact of hyper-parameter optimization tools in terms of stability, performance, and usability, among other criteria. The evaluations are carried out on public benchmarks such as PETS09 and ETH, and the results show how these tools change the performance of a MOT framework and how this would affect the results of real ranked competitions. Our goal is (1) to encourage the reader to use these tools and (2) to provide guidelines that help anyone tune their methods.
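
To illustrate what tuning the parameters of a MOT method with one of these tools might look like, the following is a minimal sketch (not taken from the paper) using TPE via the hyperopt library. The parameter names (det_threshold, max_age) and the evaluation function evaluate_mot are hypothetical placeholders standing in for a real tracker run on a benchmark such as PETS09.

    # Minimal sketch, assuming a hypothetical MOT pipeline; not the paper's code.
    from hyperopt import fmin, tpe, hp

    def evaluate_mot(params):
        # Placeholder: in practice this would run the tracker with the given
        # parameters on a benchmark sequence and return the MOTA score.
        # A dummy value is returned here so the sketch runs end to end.
        return 0.5 - abs(params["det_threshold"] - 0.4) - 0.01 * params["max_age"] / 30.0

    search_space = {
        # Hypothetical tunable parameters of a tracking-by-detection pipeline.
        "det_threshold": hp.uniform("det_threshold", 0.1, 0.9),  # detector confidence cut-off
        "max_age": hp.quniform("max_age", 5, 60, 1),             # frames to keep a lost track alive
    }

    best = fmin(
        fn=lambda p: -evaluate_mot(p),  # hyperopt minimizes, so negate the score
        space=search_space,
        algo=tpe.suggest,               # Tree-structured Parzen Estimator
        max_evals=50,
    )
    print("Best parameters found:", best)

The same search space could be handed to SMAC or Spearmint with their respective interfaces; only the objective function wrapping the MOT evaluation needs to change.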
               