- [Instructor] People make mistakes: a miscalculation here, a subtle error there. It happens. But there are bigger mistakes to worry about. I'm talking about errors that proliferate through culture. They build a misconception of reality; they skew the truth. Let's take a look at a few examples of what I mean. In 2016, researchers found some really interesting information while studying word-embedding algorithms. These algorithms use natural language processing to associate feelings and sentiments with words.
What is natural language processing? It's essentially how Google guesses what you're looking for when you're typing in a search term. Anyway, back to the study. The researchers found that the algorithms had problematic biases, such as associating computer programmer with male pronouns and homemaker or receptionist with female ones. This is obviously quite disturbing, because the widespread use of these algorithms often tends to amplify such biases. If you're interested, the paper is called Man Is to Computer Programmer as Woman Is to Homemaker?
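To make the idea concrete, here is a minimal sketch of how that kind of bias can be measured in an embedding space. The three-dimensional vectors below are entirely made up for illustration; real embeddings learned from large text corpora have hundreds of dimensions, but the measurement is the same: words whose vectors point in similar directions are treated as related, and cosine similarity quantifies that.

```python
import math

# Hypothetical toy embeddings, invented for this illustration.
# Real word-embedding models learn these vectors from text data.
embeddings = {
    "he":         [0.9, 0.1, 0.0],
    "she":        [-0.9, 0.1, 0.0],
    "programmer": [0.5, 0.7, 0.2],   # tilted toward "he" on the first axis
    "homemaker":  [-0.5, 0.7, 0.2],  # tilted toward "she"
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, -1.0 means opposite."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# In a biased space, "programmer" sits closer to the male pronoun
# and "homemaker" closer to the female one.
print(cosine(embeddings["programmer"], embeddings["he"]))
print(cosine(embeddings["programmer"], embeddings["she"]))
print(cosine(embeddings["homemaker"], embeddings["she"]))
```

Because the training data reflects how people actually write, these associations get baked into the vectors, and any system built on top of them inherits and can amplify the skew.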