(upbeat electronic music) - The first part is just getting a team together and doing what we call red teaming. That's a team that asks, what can go wrong? But that's a step you take once you've got the algorithm. The step you should have taken before that is building the algorithm, and that technology, with the people it's going to impact the most. They should be at the table with you as you're building it, not some proxy for them, not just some user research. They should be building it with you. Don't build it just with them in mind; build it with them at the table. When you do that, it changes your entire paradigm as you approach these things, in radical ways.

The final part of this is what we're still learning. We don't actually know how to test algorithms. We don't yet know how to compare them from one type to another from a black-box perspective. If you're building a self-driving car and you're testing it only in Palo Alto, is a Black person going to be in the data set? What about somebody in a wheelchair?

The key tenet there is, how do we start having this dialogue and discussion on ethics in data? There are a couple of things that need to happen first, and it starts with talent and then goes to process. The first, easy thing to do is ask: do you have a checklist as you go through and actually launch a product or use this? Some of the questions on the checklist that we talk about are: Who's going to maintain the code? Who's going to maintain the algorithm? Have you built it with the people who are going to be impacted? Have we done a red-teaming test to see what might go wrong and to test our assumptions? Have we gone to external groups who might have input that we don't have? What happens if we find there's a problem? Can we just shut it off? Do we have to iterate? What do we do? And so on. It's a lightweight checklist just to ask, do we have these things?
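The lightweight launch checklist just described could be sketched in code as a simple list of open questions that a team reviews before shipping. This is a minimal illustrative sketch, assuming the questions named in the passage; the names `LAUNCH_CHECKLIST` and `review` are hypothetical, not part of any official framework.

```python
# A lightweight pre-launch ethics checklist, sketched from the questions above.
# Names and structure are illustrative assumptions, not a standard API.

LAUNCH_CHECKLIST = [
    "Who is going to maintain the code and the algorithm?",
    "Was it built with the people it will impact at the table?",
    "Has a red team reviewed what might go wrong?",
    "Have external groups with input we lack been consulted?",
    "If we find a problem, can we shut it off, or must we iterate?",
]

def review(answers: dict) -> list:
    """Return the checklist questions that still lack an answer."""
    return [q for q in LAUNCH_CHECKLIST if not answers.get(q, "").strip()]

# Example: only the maintenance question has been answered so far.
open_items = review({LAUNCH_CHECKLIST[0]: "The platform team owns it."})
print(f"{len(open_items)} unanswered checklist items")
```

The point of keeping it this lightweight is exactly what the speaker describes: the value is in forcing the questions to be asked and answered before launch, not in the tooling itself.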
Then there's a set of classic things that come up. Is there clarity around what this algorithm is doing? Does a person know that this algorithm is doing this? Is there consent? Is there control? And you can go on. We have a framework, which we call the five Cs, that lets you walk through those specific questions as you're building. It starts with that, but that is necessary but not sufficient, because we have so much ahead of us that we have to learn as these things get developed, implemented, and put into real systems out there. (upbeat electronic music)
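The five Cs walkthrough could be sketched the same way. Note that the passage explicitly names only clarity, consent, and control ("and you can go on"), so only those three appear here; the remaining Cs are left as a placeholder rather than guessed at, and the function name `walk_through` is a hypothetical label for illustration.

```python
# A minimal walkthrough of the "five Cs" review described above.
# Only clarity, consent, and control are named in the passage; the
# remaining Cs are deliberately left as a placeholder, not invented.

FIVE_CS_QUESTIONS = {
    "clarity": "Is it clear to the person what this algorithm is doing?",
    "consent": "Has the person consented to this use?",
    "control": "Does the person have control over it?",
    # ... the remaining Cs of the framework would go here
}

def walk_through(responses: dict) -> list:
    """Return the Cs that have not yet been affirmatively answered."""
    return [c for c in FIVE_CS_QUESTIONS if not responses.get(c, False)]

# Example: clarity and consent addressed, control still open.
print(walk_through({"clarity": True, "consent": True}))
```

As the speaker notes, passing such a walkthrough is necessary but not sufficient; it structures the conversation rather than settling it.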