Sramana Mitra: Can I tell you a little bit about my interpretation of what’s happening in this realm?
Paul Daugherty: Sure.
Sramana Mitra: I think there’s going to be so much experimentation in this area in various applications of AI that it’s going to be completely unmanageable. I don’t think it’s going to be possible to regulate anything for a while.
Paul Daugherty: From a regulation perspective, I think it’s too early to regulate. That doesn’t mean it’s too early to control the implications of it. At the end of the day, we have the ability to control and decide what we want to implement. It’s about being clearer on the decisions we want to make in using these technologies.
Sramana Mitra: Think about the story that 60 Minutes did on the guy behind Cambridge Analytica. He built that model for Cambridge Analytica without really understanding, or even thinking about, the data. Our industry is full of very naive and very bright engineers who are going to tinker. They’re going to tinker with AI at a grand scale.
Right now, computer science programs are all shifting their emphasis to AI. There will be more and more people who are capable and equipped to tinker with AI. Enormous levels of AI experimentation will go on, and what that produces is going to be very difficult to control.
Paul Daugherty: I don’t know. I’m not sure I agree with that. Let me just finish on the Facebook–Cambridge Analytica thing. That was very controllable. The issue there gets back to responsible AI. Facebook may not have done anything illegal, but they had a data policy that allowed overuse of their information. That was the root cause of the issue.
The fact that the researcher then did something with it is a secondary issue. The real issue is, are we adequately controlling the use of customers’ data? That’s one of the fundamental questions we believe every company should ask in order to control access to information in the first place. Had that question been asked, I don’t think the company would have allowed access to that information. That’s the way we can control this.
Every company has data like Facebook has. You have to think about controlling the spread of the data and how it’s being used. As a society, we can choose to police the way certain algorithms are being used. I’m not as pessimistic as you are on the way it spreads and is being used. I think we’re going to see far more productive good uses of AI to do the right things.
Sramana Mitra: I don’t disagree with you that a lot of good will come out of it, but I think there will be a lot of unintended consequences. There wasn’t necessarily malice on Facebook’s part, or on the part of the guy who did the Cambridge Analytica coding, but the story played out into quite a sinister situation. That kind of unintended consequence is going to be rampant.
Paul Daugherty: We’re going to have more of those. We have to figure out how to prevent them and detect them in the right way. The other thing is, we’re going to see a shift in a lot of the business models too. One thing we’re doing a lot of research on is the issue of trust and the role it will play in the future.
Through all the examples we’ve discussed, we have seen that AI is allowing the creation of more personalized services that are more intrinsic and at the heart of what we, as humans, are doing. What that means is that companies are only going to be successful if they retain our trust and our willingness to continue to interact with them. Amazon is a good example.
We trust Amazon a lot – even in the way that it deals with us. Many customers will go so far as to give Amazon digital access to their home to drop products inside the door. That’s a high level of trust. New blockchain-enabled identity architectures protect data and control access in the right way. They might allow this to scale rapidly and create more business models where trust and control of information are at the center of the business.
It will become harder to create these knock-on effects of people misusing data. That’s the world we’re moving into, and we’re starting to see it work its way into some business models already.