From the course: Ethics and Law in Data Analytics

Ethics in hiring with big data

- If you've spent much time in organizations of any kind, you likely know how great it is to have a good boss and trustworthy coworkers. You've also probably had the opposite experience, so you know that bad bosses or bad coworkers can make a work environment nearly intolerable. And you probably won't be surprised to learn that in studies, healthy work environments are positively associated with productivity, and vice versa for toxic work cultures. So it's easy to appreciate why businesses are so concerned with hiring the right candidate. This is the basic problem that keeps Human Resources managers up at night. There have always been major barriers to hiring the right person. First of all, it's really hard to tell from a resume, a recommendation, and a half-hour interview whether the candidate in front of you will be good for your organization. This is the basic problem of measuring the qualitative. We know, both through research and basic everyday experience, what qualities are desirable in a worker. For example, emotional intelligence. But it's really hard to find a way to measure that. Then there's the problem of bias. In all societies there has been a significant bias against hiring women, and in the US there has also been a major hiring bias against people of color. And of course, the truth is that those victims of bias would have made great bosses and wonderful coworkers at the same rate as anyone else. So wouldn't it be great if an algorithm existed that could find the ideal employee? And even better, this algorithm wouldn't be biased against women and people of color, right? It probably won't surprise you to learn that there are many such algorithms already in use, and there is now a full-blown discipline, called People Analytics (sometimes H.R. Analytics, for Human Resources), that studies how algorithms can be used to find those great candidates. The experts on People Analytics predict a complete change in how organizations find employees.
Has People Analytics brought the revolution it promised? Well, there is evidence that these algorithms are doing a pretty good job so far. In the further reading section of the module, you'll find a link to a white paper published by McKinsey that details some of these successes. But remember, a theme of this course is that all revolutions have both winners and losers. In particular, there are two issues that should trouble us from an ethical perspective, and these issues are in tension, such that the solution to one tends to butt up against the solution to the other. Machines learn from data, so People Analytics algorithms learn from whatever data we provide them. On one side, the most obvious kind of relevant data about a potential worker is his or her employment history. But when algorithms read this data, all the longstanding human biases of the past are imported into the algorithm through the data. The algorithm learns from past data regarding women and people of color without knowing that the data was created in a context of bias, and so far we are not sure how to tell the algorithms about that context. So even though employment history data is extremely relevant, it is tainted by bias. One way to address that problem is to find data that does not have to do with employment history. If you remember from the lesson on Customer Relationship Management, this amounts to finding proxies: bits of data from your digital trail that are not obviously related to your employment or work history but are used as predictors of success in the workplace. Everything we were worried about then still applies, the overall worry being that, for all we know, your digital trail may paint a badly inaccurate picture of you. For example, the algorithms might learn to avoid people with a six-month gap in their employment history. But maybe at one point you took off six months to care for a parent or child.
Or maybe, because you have a highly specialized job, it took six months to find a new one. The algorithm wouldn't care about the context; it just sees the six-month gap. Now, let me be clear: I have no idea whether a six-month employment gap is actually a proxy being used by People Analytics. But perhaps that is the point. We just have no idea what these proxies are. It would be a real harm to the organization, but more importantly to you, if you were excluded from a job, or even an entire job market, because of some irrelevant proxy.
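The bias-import problem described above can be made concrete with a toy sketch. This is an invented illustration, not any real People Analytics system: a naive scorer that simply learns hire rates from past decisions will end up penalizing a six-month gap, because the historical data records the gap but not the reason for it.

```python
# Toy illustration only: all data is invented, and this is not any
# real hiring system. A naive "learn from history" scorer computes,
# for each feature value, how often past candidates with that value
# were hired, and uses that rate to score new candidates.

from collections import defaultdict

def learn_hire_rates(history, feature):
    """Fraction of past candidates with each feature value who were hired."""
    counts = defaultdict(lambda: [0, 0])  # value -> [hired, total]
    for candidate in history:
        value = candidate[feature]
        counts[value][1] += 1
        if candidate["hired"]:
            counts[value][0] += 1
    return {v: hired / total for v, (hired, total) in counts.items()}

# Invented past data: candidates with a six-month employment gap were
# rarely hired. The reasons behind the gap (caregiving, a specialized
# job search) are simply not recorded, so the scorer can never see them.
history = [
    {"gap_months": 0, "hired": True},
    {"gap_months": 0, "hired": True},
    {"gap_months": 0, "hired": False},
    {"gap_months": 6, "hired": False},
    {"gap_months": 6, "hired": False},
    {"gap_months": 6, "hired": True},
]

rates = learn_hire_rates(history, "gap_months")
print(rates[0])  # higher: no gap looks "good" to the scorer
print(rates[6])  # lower: the gap is penalized, context unseen
```

The point of the sketch is that nothing in the code is malicious; the bias enters entirely through the historical data, which is exactly the problem the lecture describes.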