Cracking open the black box of automated machine learning

Interactive tool lets users see and control how automated model searches work.

Researchers from MIT and elsewhere have developed an interactive tool that, for the first time, lets users see and control how increasingly popular automated machine-learning (AutoML) systems work.

Image: Chelsea Turner, MIT

Researchers from MIT and elsewhere have developed an interactive tool that, for the first time, lets users see and control how automated machine-learning systems work. The aim is to build confidence in these systems and find ways to improve them.

Designing a machine-learning model for a certain task - such as image classification, disease diagnosis, or stock market prediction - is an arduous, time-consuming process. Experts first choose from among many different algorithms to build the model around. Then, they manually tweak "hyperparameters" - which determine the model's overall structure - before the model starts training.
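The manual workflow described above can be sketched roughly as follows, using scikit-learn. This is an illustrative example, not code from the researchers' work; the dataset, algorithm, and hyperparameter values are arbitrary choices for demonstration.

```python
# Illustrative sketch of the manual model-design workflow: an expert
# picks an algorithm, then hand-sets hyperparameters before training.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Hyperparameters like n_estimators and max_depth fix the model's
# overall structure; tuning them by hand is the arduous part.
model = RandomForestClassifier(n_estimators=100, max_depth=5, random_state=0)
score = cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validated accuracy: {score:.3f}")
```

In practice, the expert would repeat this loop many times by hand, adjusting the algorithm or its hyperparameters after each evaluation.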

Recently developed automated machine-learning (AutoML) systems iteratively test and modify algorithms and those hyperparameters, and select the best-suited models. But the systems operate as "black boxes," meaning their selection techniques are hidden from users. Therefore, users may not trust the results and can find it difficult to tailor the systems to their search needs.
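In simplified form, an AutoML search like the one the article describes amounts to a loop that samples algorithm-hyperparameter combinations, evaluates each one, and returns only the winner. The sketch below is a hypothetical, stripped-down version of that idea (real AutoML systems use far more sophisticated search strategies); the "black box" problem is that users typically see only the final `best` model, not the full trial history.

```python
# Simplified AutoML-style random search: iteratively sample an
# algorithm and hyperparameters, evaluate, and keep the best model.
import random

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
random.seed(0)

# Candidate algorithms, each with hyperparameter values to sample from.
search_space = [
    (RandomForestClassifier,
     {"n_estimators": [10, 50, 100], "max_depth": [3, 5, None]}),
    (LogisticRegression,
     {"C": [0.01, 0.1, 1.0, 10.0], "max_iter": [500]}),
]

trials = []
for _ in range(10):
    algo, grid = random.choice(search_space)
    params = {name: random.choice(values) for name, values in grid.items()}
    score = cross_val_score(algo(**params), X, y, cv=3).mean()
    trials.append((score, algo.__name__, params))

# A typical AutoML system reports only this winner; the `trials`
# history stays hidden - the opacity a tool like ATMSeer exposes.
best = max(trials, key=lambda t: t[0])
```

A visualization tool in this space would surface the full `trials` list - which algorithms were tried, with which hyperparameters, and how each scored - rather than just `best`.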

In a paper presented at the ACM CHI Conference on Human Factors in Computing Systems, researchers from MIT, the Hong Kong University of Science and Technology (HKUST), and Zhejiang University describe a tool that puts the analyses and control of AutoML methods into users' hands. Called ATMSeer, the tool takes as input an AutoML system, a dataset, and some information about a user's task. Then, it visualizes the search process in a user-friendly interface, which presents in-depth information on the models' performance.
