In this video, learn how to apply the third question.
- You know, this issue about autonomy versus dependence, I sometimes feel like a hypocrite over here, because I'm dependent on my smartphone. I carry it with me everywhere I go. I'm checking email a hundred times a day. When I get into my car, I switch on Google Maps, and I let Google Maps override my better judgment. I know that I should be taking a right over there, but Google Maps says go straight, and I blindly follow Google Maps thinking that it must have some information I don't, and I follow it. I've become dependent on this device. You know there are studies that say that the majority of people are afraid of self-driving cars, that they wouldn't want to be in one.
Frankly, I was too, until I got into one. It took me 10 minutes to become comfortable with it. You know, you might be fearful about becoming dependent on technology, but I tell you, you already are dependent on technology. What would happen if you lost the internet for a day? You would start having withdrawal symptoms. You would feel left out. Imagine now if we didn't have TV. Imagine if we didn't have cars. With cars themselves, we're dependent on petroleum. We're dependent on this method of transportation. We live in suburbs. We couldn't get to the city, we couldn't get food if we didn't have cars.
We wear eyeglasses. We're dependent on eyeglasses to give us vision, right? So we already are dependent on technology. The question is, how far do we want to push it? How much more dependence do we want to have on digital tutors, on AI-based physicians, on self-driving cars, on robots to serve us? Because before you know it, we will also have robots that serve us. You remember Rosie from The Jetsons? By my estimate, Rosie will be practical in about a decade from now. We will have robots that talk to us, that serve us our food, that harass us, that make fun of us.
The question is, where do we draw the line? How much is too much? I don't know the answers to these things. In fact, this is the theme of my book: all I'm doing is sharing information with you, and you have to learn how to make the choices, because all of us will have a say in this. All of us will have a choice whether we want our morning coffee delivered by a Starbucks drone, or whether we want to drive to the coffee shop ourselves. These are the choices we're going to have to make ourselves. And if the majority of us decide that we will support a technology, it'll become popular, it'll become the norm.
If a majority of us decide not to support it, it won't happen. If the majority of us decide that we don't want Uber's self-driving cars taking us around because they're putting the drivers out of jobs, Uber will not implement them. It'll keep the human drivers at the wheel. But we will be facing a choice with every technology, and we have to become aware of it. We have to start learning, and we have to take responsibility. We have to become cognizant of what's happening. We can't plead ignorance, and go back into our cubby holes and say it has nothing to do with me, it's not my fault, it's not my problem.
It is your problem, it is your fault, it is your choice that you have to make.
Vivek Wadhwa offers an approach to help you make intentional choices about the technology you develop and the options you adopt when faced with uncertainty. He explains how to assess your efforts and deliver outcomes that are aligned with your values and the values of your company. Vivek goes beyond the usual question of "is this profitable?" to ask "is this something we should do?" Discover how to consider the implications of your actions and choices, weigh your options, and ultimately make more informed and mindful decisions.
- Asking tough questions
- Weighing risks and rewards
- Considering who benefits
- Choosing the right technology
- Determining how to do what is right