From the course: Ethics and Law in Data Analytics

Data, individuals, and society


- Data isn't just about the bits and bytes. It can be really personal: it can be about you, and about you in society at large. What Eva, Nathan, and I are going to talk about is the intersection of individuals, data, and society. So Nathan, could you tell us a little more about what this is going to entail?

- Yeah, I think the conversation has to start with the topic of bias, because bias obviously affects individuals, but it's also a societal question at the same time: we have to ask ourselves what kind of society we want to be. Predictive algorithms, algorithms that predict your future behavior, take large amounts of aggregated data that has been collected about you and then use statistical models to predict what's going to happen in the future. As we now know from the lab in module one, where we looked at recidivism, there are algorithms out there that predict whether you are likely to commit a crime again. Does that mean we should hold you in jail longer? Those types of algorithms are often sold on the promise that they remove human bias from these situations. There are all these studies about how judges, juries, and so on are biased, so the pitch is that we can just get the bias out. But I think the most fundamental thing to understand is that we haven't solved the problem of bias. That's the wrong way to think about it. We have a new problem of bias. That isn't to say it's a worse problem; it might turn out to be easier to solve than the human bias problem, because that one is pretty persistent. But we have to start thinking in these terms: okay, now we have a new bias problem on our hands, and that's a challenge we have to confront.

- Great, thank you. And Eva, how about you?

- I think this module is really going to help us identify the relationship between privacy and identity. Nathan is talking about using this data, and the data is largely private. It's personal to us; it identifies us.
And there are zones of privacy that include questions of autonomy and self-determination, so we're going to be exploring the ways in which this bias is also taking away our right to fundamentally define ourselves. That is a fundamental right in international law, in human rights law, and in US constitutional law. It's about recognizing that there is a relationship between the two things, and recognizing the importance of not turning over to data science, data analytics, and artificial intelligence our fundamental right to define ourselves and create our own lives.

- Brilliant, thank you. Really important stuff, and we look forward to digging in. Thank you.
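Nathan's point, that a statistical model trained on historical data can reproduce the bias in that data rather than remove it, can be sketched in a few lines. The example below is a deliberately simplified, hypothetical illustration (invented data, a frequency-based "model"); it is not the recidivism algorithm discussed in the lab, only a sketch of the mechanism.

```python
# Hypothetical sketch: a model trained on skewed historical data
# reproduces that skew. All data below is invented for illustration.

# Each record: (group, recorded_rearrest). Suppose group "A" was
# historically over-policed, so its recorded re-arrest rate is inflated
# relative to actual behavior.
history = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", True), ("B", False), ("B", False),
]

def risk_score(group, records):
    """Predict risk as the group's historical re-arrest frequency."""
    outcomes = [rearrest for g, rearrest in records if g == group]
    return sum(outcomes) / len(outcomes)

# The "objective" model simply echoes the skew in its training data:
print(risk_score("A", history))  # 0.75
print(risk_score("B", history))  # 0.25
```

The model is doing exactly what it was asked to do, yet its outputs inherit whatever bias shaped the records it learned from. This is the "new problem of bias" in miniature.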
