Vivek covers the scope of the course: what is the issue and who is it for? This course is about making intentional choices about the technology we develop and use.
- You know, it's very difficult to know what side you're on. Take the example of what Facebook just went through with fake news: it's widely believed that fake news was used to influence the election. Whether you're on the left or the right, it doesn't matter. The fact is we've all been inundated with negative news on social media. Now, the developers of these algorithms, the developers of this software, were they aware that their tool would be used to influence elections and to change outcomes for humanity? I don't think so, but when they became aware of it, they all started discussing it and debating it.
Now Facebook has a choice. It can limit its technology, it can put checks and balances in there, but then it's creating censorship. What does a company do? No one really knows, but these are the types of dilemmas we're going to face over and over again. All I can advise you is to learn the impact of your technology. Some of the developers of these technologies were aware of the power that they had, or they should have been aware of it. Mark Zuckerberg should have been aware that his amazingly powerful social media platform could be used to influence people.
He's had studies telling him this for many years now: the fact that these platforms do influence people's thinking, they do influence voting patterns, and so on. So these are the types of ethical debates we need to have when we create any technology. We may not get the answers right, but at least we've given it the best thought we can and tried to put checks and balances into our technologies. This is our responsibility. We can't say it's someone else's fault. We have to make the choices as we go. We have to answer difficult questions. Let's talk about self-driving cars. Imagine you're one of the creators of the software.
You know that your technology is going to make life easier for literally billions of people. It's going to make transportation available to everyone who doesn't have it. It's going to make it affordable. It's going to reduce highway deaths. But you do have to code some decisions in there about the so-called trolley problem: what happens when you have to choose between a busload of children and another car? This is a very rare circumstance, but we still have to think about the implications of it.
But more realistically, think about the impact that self-driving cars will have. You know those hundreds of thousands of Uber drivers worldwide, the millions of truck drivers worldwide? Well, guess what, they're going to lose their jobs, and they're going to lose their jobs because of you, because you created this technology. Are you responsible for this? Well, if your boss tells you to do it, you're going to do it, but the fact is you have to think about the implications. You have to be aware that your technology is going to impact the livelihoods of millions, possibly billions. On the one hand, many people will benefit from it.
Blind people will now be able to have transportation. Mothers will be able to send their children to school and not have to worry about whether they arrive safely. We will all be able to afford to get from point A to point B. This is all the good. On the other hand, you're going to create unemployment. Now, you can step back and say this isn't my choice, this isn't my fault, but the fact is you can't. You are responsible for your creations. And as consumers, we are responsible for our choices too: picking Uber's self-driving cars versus Uber with human drivers.
We can decide that we're going to support Uber's drivers and not take the self-driving option. As producers of this technology, we can put checks and balances into technologies to make sure that they're used in good ways. But we're going to face this choice over and over again, because technology has become so powerful. It allows us to do amazing things and scary things at the same time. This is going to happen in practically every field there is, because it's not just self-driving cars and artificial intelligence; everything from agriculture to finance to retail to transportation is going to be impacted by advancing technologies, and we're going to face the same moral dilemmas over and over again.
So we need to start learning about them. We need to be aware of the impact our technologies can have and use them in the best possible way. So let me throw a scenario at you: we can now edit the human genome. Whether you're a creator or a consumer, let's say that you or your spouse were about to have a child, and you learned that the child was going to have a debilitating disease, that the child was going to grow up handicapped and suffer for the rest of their life. If you had the choice of editing that child's genome to snip out that bad gene so the child would live a perfectly healthy life, how many of you would say we should do it? The vast majority would.
Even people who believe that we are God's creation, that we shouldn't be making these choices, would sit back and say, "What right do I have to make my child suffer for his or her entire life?" The vast majority of people would say we should do it. So that's an easy decision to make. Now let me go to the next step: while you're making that edit, you can add a couple of extra IQ points. You can add 20 extra IQ points. You can make the child healthier, stronger. You can give them blue eyes versus black eyes.
How many of us would agree to that? That's a very difficult choice, because now you get into the moral issues: do we have the right to create superhuman beings? Do we have the right to select the attributes our child should have? These are difficult choices, but it's the same technology I'm talking about. It's the same decision we have to make, because if you're already taking that capsule, if you're already editing the DNA, you might as well edit a couple more genes. It's like being in a Word document editing two or three sentences: you see a couple of other things you can fix, so why not fix those as well? Very difficult choices, but this is what technology makes possible right now, and it's going to be up to us, as consumers and as producers, to make these choices and to decide what's right or wrong.
Vivek Wadhwa offers an approach to help you make intentional choices about the technology you develop and the options you choose when faced with uncertainty. He explains how to assess your efforts and deliver outcomes that are aligned with your values and the values of your company. Vivek goes beyond the usual discussion of "is this profitable?" to "is this something we should do?" Discover how to consider the implications of your actions and choices, weigh your options, and ultimately make more informed and mindful decisions.
- Asking tough questions
- Weighing risks and rewards
- Considering who benefits
- Choosing the right technology
- Determining how to do what is right