Author Talks: How scientific thinking can help us tackle our toughest societal problems

In this edition of Author Talks, McKinsey Global Publishing’s Stephanie d’Arc Taylor speaks with physicist Saul Perlmutter and social psychologist Robert MacCoun about their new book Third Millennium Thinking: Creating Sense in a World of Nonsense (Hachette Book Group, March 2024), coauthored with philosopher John Campbell. In an era of bitter factionalism and intractable disagreement, the authors offer a way forward: incorporating scientific thinking into problem-solving debates. Perlmutter and MacCoun believe that by embracing curious skepticism rather than cynical pessimism, we can solve the thorniest social issues. An edited version of the conversation follows, and you can also watch the full video at the end of this page.

Why did you write this book?

Saul Perlmutter: The genesis for this book goes back ten years or so. It was clear that society didn’t have any real way of having rational, simple discussions together. I would watch a group of scientists discuss a problem or think through a decision over the lunch table. The whole vocabulary of ideas that they would use for how to think about problems just looked very different from what I was seeing in society at large. I wondered, is there a way we can capture what scientists have learned from scientific research that would be useful for everybody?

First, I thought about it as a physics professor: “Let me try to figure out a course that would teach this from the point of view of a physicist.” I realized that that wasn’t quite good enough. If you go to a physics faculty meeting, it isn’t run any more rationally than any other faculty meeting. I realized we needed some other expertise as well. That’s when I invited both John Campbell and Rob MacCoun into the conversation. John is a philosopher, and Rob is a social psychologist.

Robert MacCoun: John came at this as a philosopher. He was very interested in the question of what kind of authority scientists should have in a democratic society. Why, when scientists speak, should anyone give their words extra weight compared with what other people say? Yet you also want to safeguard the freedoms of individuals to make their own decisions.

I came at it from a different perspective. I had been doing empirical research on emotional policy debates, things like recreational drug use and gay and lesbian people serving in the US military. I was trying to be an honest broker, saying, “What do we actually know about these debates?” I quickly realized that people were cherry-picking evidence for their side of the issue. I thought if we want to have evidence-based policy making, we need to do a better job of talking about the issues.

What surprised you in the research or writing process?

Robert MacCoun: We were surprised throughout the process. A lot of the surprises came from the fact that we come from different academic disciplines. We had to learn one another’s jargon.

Saul Perlmutter: One of the surprises was coming to grips with all the ways we fool ourselves and all the ways we try to avoid fooling ourselves. Scientists have learned a form of skepticism—to watch out for the different ways we can personally go wrong. That’s very important.

But when you start to doubt whether you can ever trust any result, it’s easy to fall into a form of cynicism. That is, of course, not the lesson of science. Scientists have learned how to balance skepticism with an ability to recognize sound scientific processes and to know when you can trust a result.

One of the things that we talk about is how to know which experts to trust. We think you should be looking for experts who are willing to show you the weaknesses of their argument, who are willing to change their mind when they learn something different. Those kinds of experts are the ones that you should be hunting out and relying on. If you see that kind of balanced, thoughtful process in a group that’s coming to a consensus about some idea, that’s a much more trustworthy consensus than a group that seems to be trying to convince you of something.

Why are scientific approaches to decision making well-suited to the challenges of our current era?

Saul Perlmutter: It’s difficult to make a clear decision if you’re scared. When people are scared, they tend to hunker down. They’re less able to look at the issues and figure out what needs to be done. We live in a time when the media, by design, tend to scare us. They find the most terrible things that are happening anywhere in the world and amplify them, because those are the things that grab people. That means people today are frightened of one another in a way that makes it hard for them to make progress together.

Robert MacCoun: One thing that’s significant in our current day and age is that social media has revealed problems with how people approach argumentation. We’re all seeing how pathological it can get.

One thing we see in social media debates is pessimism combined with skepticism. This can lead to a cynical worldview, which is discouraging. Unfortunately, it’s easy to look clever by being cynical online. We also see some people who combine optimism with gullibility. They think things are great. They describe a world in which there are no problems. That’s not helpful either. A third possibility is people who are pessimistic but also gullible. That leads to conspiracy thinking. There’s a lot more conspiracy thinking than we realized.

What we’re calling third millennium thinking is cultivating skepticism rather than gullibility but also optimism rather than pessimism. The optimism is not so much a belief that things inexorably will get better. It’s optimism that if there’s a problem, there must be a solution out there and we’re determined to find it.

How can scientific approaches help us work better with people we don’t agree with?

Saul Perlmutter: Throughout history, the most dramatic scientific advances have been made by people who disagree with each other working together to sharpen each other’s thinking. Scientific culture understands this as almost a prerequisite to good thinking. That’s very different from a culture that finds disagreement scary, or where the goal is to disagree and win as opposed to disagree and figure it out.

Robert MacCoun: The first half of our book focuses on what we call habits of mind: individual-level skills that scientists cultivate about how to think carefully about evidence. We also devote a lot of the book to habits of community, which are ways of working with people who you disagree with to try to find a common understanding, to try to come up with win–win solutions that take into account the concerns of both sides. These habits of community are as important as individual-level cognition. You need to create a community in which people understand that there’s going to be disagreement, but that disagreement does not have to be a blood sport. It can lead to progress. The book isn’t anticonflict by any means.

Saul Perlmutter: We realized that we as authors and scientists still fall into the traps we warn about in the book. The way that we usually catch ourselves is with the help of other scientists. We embed ourselves with people who know these same mental traps. They keep us honest. So when you go to work with other people you may disagree with, your goal probably shouldn’t be to convince them of something. Your goal should be to figure out where you’re going wrong and use your colleagues to help you catch your errors.

What role does bias play in decision making?

Robert MacCoun: The way our brains process information often tricks us into making mistakes. Really, the history of scientific methodology is a history of how to set up conditions in a study to minimize these biases.

Saul Perlmutter: These scientific approaches to avoiding biases have useful counterparts in our day-to-day lives. For example, you have the problem of confirmation bias—that’s the bias where you pay attention only to evidence that supports your existing point of view, and you’re critical of evidence that contradicts something you already believe.

To help us fight this bias, we could use the technique of blind analysis. Suppose you’re trying to figure out which websites to trust for medical advice. One way to do it is to read a bunch of websites until you find the ones that recommend something you were hoping they would recommend. But that clearly is just confirmation bias.

A blinded version of the same experiment would be to first establish which websites you trust before you look at what they recommend for your particular medical decision. Once you’ve done that, then allow yourself to see what they recommend. That way you avoid just getting the advice you wanted to hear, which is almost useless.
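To make the commit-before-reveal structure of a blind analysis concrete for readers who like to tinker, here is a minimal sketch in Python. It is not from the book, and everything in it (the source names, the quality criteria, and the helper function) is hypothetical and purely illustrative: trust scores are assigned using only quality criteria, and the recommendations are read only after the ranking is locked in.

    # Minimal sketch of a blind analysis (hypothetical data and criteria).
    # Step 1: score how much you trust each source using only quality criteria,
    # without looking at what the source recommends.
    # Step 2: only after the trust ranking is committed, reveal the recommendation
    # of the source you already decided to trust.

    from dataclasses import dataclass

    @dataclass
    class Source:
        name: str
        cites_evidence: bool       # quality criterion: cites primary studies?
        discloses_conflicts: bool  # quality criterion: discloses conflicts of interest?
        recommendation: str        # kept out of the trust assessment

    def trust_score(source: Source) -> int:
        """Score a source on quality criteria only; never look at its recommendation."""
        return int(source.cites_evidence) + int(source.discloses_conflicts)

    # Hypothetical sources, for illustration only.
    sources = [
        Source("site-a.example", cites_evidence=True, discloses_conflicts=True, recommendation="option X"),
        Source("site-b.example", cites_evidence=False, discloses_conflicts=False, recommendation="option Y"),
    ]

    # Step 1: commit to a trust ranking while "blind" to the recommendations.
    ranked = sorted(sources, key=trust_score, reverse=True)

    # Step 2: unblind, reading only the recommendation of the most trusted source.
    most_trusted = ranked[0]
    print(f"Most trusted source: {most_trusted.name} -> {most_trusted.recommendation}")

The design choice mirrors the advice in the interview: the scoring step cannot be influenced by the answer you were hoping for, because the answer is not consulted until the ranking is already fixed.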

How can people be encouraged to adopt scientific approaches?

Robert MacCoun: Something we devote a lot of attention to in the book is the problem of overconfidence. We give lots of examples of how experts are routinely overconfident in their pronouncements. This is a problem for credibility. Scientists, on the other hand, tend to qualify what we say rather than talk in categorical terms. Qualifying our opinions is a hard habit to cultivate. Even scientists who are trained in this often fail to implement it.

I think everybody needs to be taught how to routinely hedge when stating opinions. In the book, we argue that people should put themselves on an opinion budget and limit the number of opinions they hold. If you encounter some brand-new topic in the news, there’s no reason you should have an opinion yet. It’s really premature. But a lot of the time we lock ourselves into a position, and it leads to poor-quality argumentation. In particular, it leads to arguing just to win an argument rather than to find out what the truth is.

Saul Perlmutter: People get convinced by watching people [leaders] they respect lead by example. If those leaders present their points of view in probabilistic terms and are clear that they could be wrong, that becomes a culture that people want to aspire to.

In terms of the bitter factionalism that we can sometimes see in our society, we’re often most aware of the most difficult personalities in a debate. It’s often the most partisan actors who get the attention. But if you talk to a random person from a population, the story looks very different. My understanding is that with the majority of the population, if you bring them into a conversation in a healthy way, where they don’t feel they have to defend a position, you can get a much more useful conversation going.

What are the conditions needed for good decision making?

Robert MacCoun: One of the things we know about small-group decision making is that when everybody in a group already agrees, things go smoothly. If people disagree, there’s more friction. So it’s tempting to create groups where everybody agrees, so we can avoid the friction. But the reality is, if we want to improve our thinking and find better solutions, we need to construct groups where there is disagreement. If you can’t do that, you need to assign someone to play the role of devil’s advocate. Nobody necessarily wants to play that role. But sometimes, to hone better decision making, you need to make sure other points of view are also seriously considered.

Saul Perlmutter: There are techniques to coax out interesting deliberation. We became optimistic based on the results of a technique called deliberative polling. You draw a random sample of a population to hold a deliberation. You also make sure that the deliberative group is given a chance to ask questions of a range of experts who can tell them what the evidence is. Groups of people are pretty good at judging what to make of evidence and expert opinion. They don’t always get it perfect, but they’re remarkably capable of balancing the evidence. Then their points of view often change.

A number of years back, the Texas Public Utility Commission used deliberative polling to choose which sources of energy to invest in. They ran independent deliberation events, and they found that people really thought through the problems and independently came to similar conclusions. It brought them to a different mix of energy solutions than they otherwise might have chosen.

What makes you hopeful for the future?

Robert MacCoun: One thing we try to bring out in the book is that even though we’re not currently cooperating very well, we actually know how to do better. People have cooperated throughout history in amazing ways, and we can do it again. The techniques that seem to help us do it can be taught. It’s easy to get pessimistic these days. It seems like there are problems everywhere you look. But, ultimately, we come away with a sense that we can do better.

There are a lot of reasons to think we can pull ourselves together and work through the daunting technological problems we face with artificial intelligence, climate change, and more. We think we can pull through this. There are ways people can creatively tackle these problems.

Saul Perlmutter: These scientific techniques can make a big difference. Even if just a small fraction of the world were trying to engage and figure these things out in this way, I would not feel worried about most of the big problems of the world.

Watch the full interview
