Originally published on “Matters” by Designit
The pioneering interaction designer intends to fix the world with his tech-political movement, Ancestry Thinking, which focuses on making it a better place for our children.
This is part three of an interview series with Alan Cooper, “The Father of Visual Basic,” and co-founder of the pioneering design firm Cooper, which joined the Designit and Wipro Digital family last October. Alan is the author of About Face: The Essentials of Interaction Design.
Designit: When you started Cooper, you wanted to fix the world. How do you feel about the future and what do you want to fix?
Alan: I’ve been an entrepreneur all my life and a huge believer in free enterprise and rewards for the work you do, and I think a lot of stuff has gotten out of hand. For me, the issue is the class of unscrupulous people who see things purely in terms of business success and who are perfectly willing to subvert and suborn our social institutions in order to amass money.
Designit: Why is this such a problem today?
Alan: Because the machinery of the information age is the fabric of our everyday society. Look at how we handle electricity. We understand that electricity is a common good, so we’re willing to let the organizations that supply us with it make a modest profit off what they do, because we want them to innovate — but we control it. For the opportunity to make a regulated, capped profit, we demand that companies provide electricity to people in our country even in places where they don’t make money. Power is a common good. Profit is not the motivation. The service to the community is the motivation.
Designit: What technologies and services should be regulated going forward?
Alan: If you have all the information in the world, like Google, that’s like having all the electricity in the world, and it should be regulated for the common good. It shouldn’t be permissible for a few people to make enormous quantities of money off this common good while everyone else suffers. It’s the same with retail. It used to be distributed. Then the Big Boxes pushed the mom-and-pop shops out. And now Amazon, the biggest box of all, is taking over. Retail is becoming a common good, because everyone is buying everything from Amazon and Alibaba. The same goes for social media, because it is now interwoven with the ways in which we communicate: What are Twitter, Facebook, and WhatsApp really providing? Once you inject a profit motive, it subverts the purpose of the service.
Designit: How does this tie into your current project about Ancestry Thinking?
Alan: Ancestry Thinking is about creating a better world for our children. In the same way that I created a taxonomy for making interaction design user-friendly, I want to create a taxonomy for handling bad technological and design behavior. First, we need to look at the assumptions we are making. Then we ask: what assumptions underpin those assumptions? And so on. As you work your way out, you find that the assumptions controlling our world have been latent all along.
Designit: Can you give us an example?
Alan: Thirty years ago, you went to the corner store and the people knew you and knew what you liked. This is good business. But when the world’s largest retailer knows every product a person buys, and tracks them, we’re talking about something qualitatively different. This is when you have to question the assumptions you’ve held all along. For example: it’s good to know your customer! No one is saying, hey, let’s take another look at that. But now that you can aggregate all that data, the idea that a retailer should “know” its customers has radically different implications.
Designit: In your Interaction18 keynote, you refer to the Manhattan Project and offer examples of how good work can be turned to harmful ends. For example, hackers infiltrate your chatbot with lies and prejudices and it starts spouting hateful venom. Or you analyze user data to give social media users the posts they prefer to see, and Russian hackers use the psychological profile you created to hack the 2016 U.S. election. How can you help?
Alan: There are two things we are trying to do. First, nip badly behaved technological products in the bud and stop them before they become nasty. You do that by questioning your assumptions early on and continually throughout development. Second, you look at externalities.
Designit: Give us some examples of assumptions and externalities in companies today.
Alan: For example, Uber says: I’m enabling great opportunities for people to put their cars to use and make money. The externality is that you’re creating jobs with no benefits for people, in a country in which healthcare is not provided by the government. You say: not my problem! Walmart says: we pay $7 an hour. The externality is that people can’t afford to live on that. Not your problem! Another example is the Second Amendment assumption that you needed a musket to defend against British soldiers. But the fact that you can have radically awesome firepower at your disposal today means the assumptions behind the Second Amendment need to be examined.
Designit: How do you figure out whether stuff being designed today has a diabolical future?
Alan: The third part of what we’re doing is recognizing timescale. As we design solutions to problems, we have to figure out the timescale we are using. We design for today, but what is the life span of our products and services, and of our actions?
For example, Facebook was just fine when it was created for university students to swap pictures and say whether they hooked up. But look what Facebook is costing us, now that it has become so lawless and unwieldy. Murders and suicides take place live on it. Your profile gets stolen and is used to manipulate you. As I said in that keynote, we want a villain. But I’m not sure anyone is to blame for the abuse in the system. It is a systems problem. It is not inevitable that our technological achievements become agents of evil. What we need is to balance the imperatives of making money with ethics. We owe it to our ancestors to have an ethics constitution.
Designit: Speaking of ethics, what are the unintended consequences of Big Data and artificial intelligence? There has been a huge surge of interest and investment in these technologies.
Alan: It’s not about the unintended consequences of technology as much as it’s about the unintended consequences of business models — those based on making and consuming vast quantities of money. What we’re discovering is that by using these tools of data aggregation, you can concentrate enormous quantities of money.
Now the problem, or the opportunity, is that in order to create these gigantic money pumps that pour millions into the pockets of dozens, you need practitioners: designers, developers, and deployers. This is why you find giant, successful tech companies within a few miles of world-class universities. They want access to smart, young, ambitious people who want to make a name for themselves and to do really good work.
Designit: How do we turn this problem into an opportunity?
Alan: A lot of practitioners realize they’re being used. I’m interested in holding a mirror up to this situation and saying: “You’re a lot more powerful than you think, Mr. Practitioner, the system can’t work without you.” I want to tell them this: If you want to live in a fair, just, ethical, and equitable world, don’t wait for the people building the money pumps to create that world, because they’re not going to. If you refuse to build these money pumps, they won’t get built.
Part One: Alan Cooper on Designing the Future
Part Two: What is the ROI on Management?