A Major Challenge in Machine Learning Applications: Interpretability

PropTech@ecyY
6 min read · Nov 9, 2023

Misconceptions Surrounding Rule Generation

A common misconception about machine learning is the belief that it "produces rules" from data and answers. In fact, machine learning does not generate explicit, easily interpretable rules. Instead, it learns intricate patterns and associations within the data in order to make predictions or generate answers. These patterns, often loosely described as rules, can be very difficult for humans to decipher, which is one of the fundamental challenges in machine learning. As a result, machine learning is frequently perceived as a "black box": it provides answers without revealing its underlying mechanisms. This opacity poses real difficulties when we try to understand why the answers it gives are not what we expected.
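The contrast can be made concrete with a toy example. Below, a hand-written rule (traditional programming) and a learned logistic model (machine learning) solve the same task of deciding whether a number is "large". The task, function names, learning rate, and epoch count are all illustrative assumptions, not anything from the article; the point is only that the trained model's "rule" is a pair of opaque numbers rather than readable logic.

```python
import math

# Traditional programming: the rule is explicit and human-readable.
def explicit_rule(x):
    return 1 if x > 5 else 0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, lr=0.1, epochs=3000):
    """Fit a tiny logistic model p = sigmoid(w*x + b) by stochastic
    gradient descent, nudging (w, b) to shrink prediction error."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in examples:
            p = sigmoid(w * x + b)
            # move parameters against the gradient of the log loss
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

# Labeled examples generated by the explicit rule itself.
examples = [(x, explicit_rule(x)) for x in range(11)]
w, b = train(examples)

# Machine learning: the learned "rule" is just the numbers (w, b).
# Their meaning is implicit in how they interact, not stated anywhere.
def predict(x):
    return 1 if sigmoid(w * x + b) >= 0.5 else 0
```

After training, `predict` behaves much like `explicit_rule`, yet inspecting `w` and `b` tells a human almost nothing about *why* a given input is classified as it is; that gap is the interpretability problem in miniature.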

Training and Testing in a Black-box Model

The diagram illustrates the training process in machine learning. During training, the algorithm refines its internal parameters (often likened to rules) to minimize the disparity between its predictions and the provided correct answers. Unlike traditional programming, in which a programmer explicitly supplies the rules, both traditional programming and…
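A minimal sketch of that training loop is shown below, assuming plain gradient descent on a linear model. The toy data, learning rate, and step count are hypothetical choices for illustration; the essential idea from the paragraph above is only that the parameters are adjusted iteratively so that a measured gap (the loss) between predictions and correct answers keeps shrinking.

```python
# Toy "correct answers" generated from y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(10)]

def mse(w, b):
    # Mean squared error between predictions w*x + b and the answers y.
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

w, b = 0.0, 0.0   # internal parameters, starting from zero
lr = 0.01         # learning rate (step size)
history = []

for step in range(2000):
    # Gradients of the mean squared error with respect to w and b.
    dw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    db = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    # Parameter update: move against the gradient to reduce the loss.
    w -= lr * dw
    b -= lr * db
    history.append(mse(w, b))
```

The recorded `history` decreases over the iterations, which is exactly the "minimize the disparity" behavior described above; note that what comes out is a pair of fitted numbers, not an explanation.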
