Data Bias in Machine Learning: Implications for Social Justice

Take historically biased data, then add AI and ML to compound and exacerbate the problem.

Machine learning and artificial intelligence have taken organizations to new heights of innovation, growth, and profit thanks to their ability to analyze data efficiently and accurately. However, some classes of algorithms, black-box models in particular, have repeatedly proven to be opaque and unfair, amplifying bias and causing detrimental impact on minorities.

Black-box models present several key issues that work together to compound data bias. The most prominent is that models are fed data that is historically biased to begin with, selected and labeled by humans who carry biases of their own. In addition, data analysts can see a model's inputs and outputs but not the internal workings that determine its results, even as machine learning systems constantly aggregate data, including personal data, with no transparency about how that data is used or why. This opacity means algorithms make analyses and predictions about our work performance, economic situation, health, preferences, and more without offering any insight into how they reached their conclusions.
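
To make the historical-bias point concrete, here is a minimal sketch in Python using entirely synthetic, hypothetical data (the "skill", "proxy", and "hired" variables are illustrative inventions, not drawn from any real system). It shows that a model trained on historically biased decisions can reproduce that bias even when the protected attribute is withheld, because the model recovers group membership through a correlated proxy feature:

    # Minimal sketch with synthetic data: a model trained on historically
    # biased labels reproduces the bias via a correlated proxy feature.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical historical records: a protected attribute (0/1) and a skill score.
    group = rng.integers(0, 2, size=n)
    skill = rng.normal(size=n)

    # Biased historical labels: past decisions penalized group 1 regardless of skill.
    hired = ((skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0).astype(int)

    # The model never sees `group`, but a proxy feature (think zip code)
    # correlated with it leaks the information back in.
    proxy = skill + 0.6 * group + rng.normal(scale=0.3, size=n)
    X = np.column_stack([skill, proxy])

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, hired)
    preds = model.predict(X)

    for g in (0, 1):
        # Group 1's predicted positive rate stays depressed: the historical
        # penalty survives even though the protected attribute was dropped.
        print(f"group {g}: predicted positive rate = {preds[group == g].mean():.2f}")

Dropping the protected attribute from the inputs, as the sketch illustrates, is not enough: the bias is baked into the labels and leaks through correlated features.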

This matters in the infosec realm because security platforms and services increasingly rely on ML and AI for automation and superior performance. If the underlying software and algorithms reflect biases, these products and services will simply perpetuate prejudices and errant conclusions tied to race, gender, religion, physical ability, appearance, and other characteristics. That has implications for information security and physical security alike, as well as for personal privacy.
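
One practical way to catch such bias before a security model ships is a simple outcome-rate audit. The sketch below (the decision and group values are hypothetical placeholders) compares positive-decision rates across groups; a large gap is a signal worth investigating, though demographic parity is only one of several fairness criteria and this is not a complete audit:

    # Minimal demographic-parity sketch with hypothetical data: compare a
    # model's positive-decision rates (e.g. "flagged as risky") across groups.
    from collections import defaultdict

    def positive_rates(decisions, groups):
        """Return the fraction of positive decisions per group."""
        counts, positives = defaultdict(int), defaultdict(int)
        for decision, group in zip(decisions, groups):
            counts[group] += 1
            positives[group] += decision
        return {g: positives[g] / counts[g] for g in counts}

    # Placeholder outputs from a hypothetical screening model.
    decisions = [1, 0, 1, 1, 0, 1, 0, 0, 0, 1]
    groups    = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]

    # A wide gap between groups warrants investigation before deployment.
    print(positive_rates(decisions, groups))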

One of the most prominent examples of bias presented by these key issues ..