Yes, machine learning is capable of learning on its own. Yes, a machine should apply algorithms objectively to mine data, run analyses, and reach conclusions. However, a growing movement is exposing that some AI programmers are, knowingly or not, building bias into their algorithms. What sociological and psychological impacts does this kind of marginalization have?
This month's topic might be a little controversial. But hey, that's the point of the series: to get you thinking about this stuff. A quick Google search will tell you that discrimination can be defined as the unjust or prejudicial treatment of different categories of people or things, especially on the grounds of race, age, or sex. Maybe the word discrimination is a bit too strong for my message. But in the real world, that's exactly what it is.
In the world of machine learning and AI, however, it's better known as bias. Now, the word bias has several meanings in this domain. In the statistical sense, it's the tendency of a model to consistently learn the wrong relationships because it doesn't take all the important variables into account. A model with high bias can't capture the relationships between features effectively, so it underfits the dataset, giving low accuracy even on the training data. The flip side is high variance, where the model overfits the data: it fits the training set very closely, noise and all, but generalizes poorly to new examples.
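The underfitting/overfitting trade-off described above can be sketched with a toy example. This is a minimal NumPy illustration, not code from any real system: the data is a noisy sine curve invented for the demo, and polynomial degree stands in for model capacity.

```python
import numpy as np

rng = np.random.default_rng(0)

# The "true" relationship is a sine curve; we only observe noisy samples.
x_train = np.sort(rng.uniform(0, 2 * np.pi, 40))
y_train = np.sin(x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.sort(rng.uniform(0, 2 * np.pi, 40))
y_test = np.sin(x_test) + rng.normal(0, 0.2, x_test.size)

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# High bias: a straight line can't capture a sine wave, so it underfits --
# large error on the training data AND on unseen data.
underfit_train, underfit_test = fit_errors(1)

# A reasonable capacity: degree 5 tracks one period of a sine closely.
balanced_train, balanced_test = fit_errors(5)

# High variance: a very flexible model chases the noise in the training
# points (low training error) but tends to generalize worse.
overfit_train, overfit_test = fit_errors(15)

print(f"underfit: train={underfit_train:.3f} test={underfit_test:.3f}")
print(f"balanced: train={balanced_train:.3f} test={balanced_test:.3f}")
print(f"overfit:  train={overfit_train:.3f} test={overfit_test:.3f}")
```

The pattern to notice: the high-bias model is bad everywhere, while the high-variance model looks great on the data it memorized and falls apart on data it hasn't seen.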
But that's not the kind of bias I'm talking about here. I'm talking about real-world bias that comes out of algorithms and machine learning, the kind we see in this world as discrimination. And now you're thinking: wait a minute, you're talking about machines. Machines aren't supposed to discriminate. They're supposed to make objective decisions. Decisions based on algorithms. Algorithms constructed by really smart teams of humans. And therein lies the problem. We expect machines to be these omniscient, do-no-wrong decision makers.
However, they're programmed by quite possibly the most illogical and irrational species on the planet: us. Let's take a deeper dive and look at how small, subtle biases in programming affect humans psychologically and sociologically in big ways.