Enron, Worldcom, Bernie Madoff, the subprime mortgage crisis.
Over the past decade or so, news stories about unethical behavior have been a regular feature on TV, a long, discouraging parade of misdeeds marching across our screens. And in the face of these scandals, psychologists and economists have been slowly reworking how they think about the cause of unethical behavior.
In general, when we think about bad behavior, we think about it being tied to character: Bad people do bad things. But that model, researchers say, is profoundly inadequate.
Which brings us to the story of Toby Groves.
Toby grew up on a farm in Ohio. As a kid, the idea that he was a person of strong moral character was very important to him. Then one Sunday in 1986, when Toby was around 20, he went home for a visit with his family and learned that his brother had been caught committing fraud. Standing with his heartbroken father, Toby promised that he would never do the same — an experience that made the need to be good dramatically more pressing.
Twenty-two years after Toby made that promise to his father, he found himself standing in front of the exact same judge who had sentenced his brother, being sentenced for the exact same crime: fraud.
And not just any fraud — a massive bank fraud involving millions of dollars that drove several companies out of business and resulted in the loss of about a hundred jobs.
In 2008, Toby went to prison, where he says he spent two years staring at a ceiling, trying to understand what had happened.
Was he a bad character? Was it genetic? “Those were things that haunted me every second of every day,” Toby says. “I just couldn’t grasp it.”
This very basic question — what causes unethical behavior? — has been getting a fair amount of attention from researchers recently, particularly those interested in how our brains process information when we make decisions.
And what these researchers have concluded is that most of us are capable of behaving in profoundly unethical ways. And not only are we capable of it — without realizing it, we do it all the time.
Consider the case of Toby Groves.
In the early 1990s, a couple of years after graduating from college, Toby decided to start his own mortgage loan company — and that promise to his father was on his mind.
Before long, his business ran into losses, and he needed a loan to cover them. So Toby decided to lie.
He told the bank that he was making $350,000, when in reality he was making nowhere near that.
This is the first lie Toby told — the unethical act that opened the door to all the other unethical acts. So, what was going on in his head at the time?
“There wasn’t much of a thought process,” he says. “I felt like, at that point, that was a small price to pay and almost like a cost of doing business. You know, things are going to happen, and I just needed to do whatever I needed to do to fix that. It wasn’t like … I didn’t think that I was going to be losing money forever or anything like that.”
Consider that for a moment.
Here is a man who stood with his heartbroken father and pledged to behave ethically. Anyone involved in the mortgage business knows that it is both unethical and illegal to lie on a mortgage application.
How could that promise be so easily broken?
To understand, says Ann Tenbrunsel, a researcher at Notre Dame who studies unethical behavior, you have to consider what this looks like from Toby’s perspective.
There is, she says, a common misperception that at moments like this, when people face an ethical decision, they clearly understand the choice that they are making.
“We assume that they can see the ethics and are consciously choosing not to behave ethically,” Tenbrunsel says.
This, generally speaking, is the basis of our disapproval: They knew. They chose to do wrong.
But Tenbrunsel says that we are frequently blind to the ethics of a situation.
Over the past couple of decades, psychologists have documented many different ways that our minds fail to see what is directly in front of us. They’ve come up with a concept called “bounded ethicality”: That’s the notion that cognitively, our ability to behave ethically is seriously limited, because we don’t always see the ethical big picture.
One small example: the way a decision is framed. "The way that a decision is presented to me," says Tenbrunsel, "very much changes the way in which I view that decision, and then, eventually, the decision I reach."
Essentially, Tenbrunsel argues, certain cognitive frames make us blind to the fact that we are confronting an ethical problem at all.
Tenbrunsel told us about a recent experiment that illustrates the problem. She got together two groups of people and told one to think about a business decision. The other group was instructed to think about an ethical decision. Those asked to consider a business decision generated one mental checklist; those asked to think of an ethical decision generated a different mental checklist.
Tenbrunsel next had her subjects do an unrelated task to distract them. Then she presented them with an opportunity to cheat.
Those cognitively primed to think about business behaved radically differently from those who were not — no matter who they were, or what their moral upbringing had been.
“If you’re thinking about a business decision, you are significantly more likely to lie than if you were thinking from an ethical frame,” Tenbrunsel says.
According to Tenbrunsel, the business frame cognitively activates one set of goals — to be competent, to be successful; the ethics frame triggers other goals. And once you’re in, say, a business frame, you become really focused on meeting those goals, and other goals can completely fade from view.
Tenbrunsel listened to Toby’s story, and she argues that one way to understand Toby’s initial choice to lie on his loan application is to consider the cognitive frame he was using.
“His sole focus was on making the best business decision,” she says, which made him blind to the ethics.
Obviously we’ll never know what was actually going through Toby’s mind, and the point of raising this possibility is not to excuse Toby’s bad behavior, but simply to demonstrate in a small way the very uncomfortable argument that these researchers are making:
That people can be genuinely unaware that they’re making a profoundly unethical decision.
It’s not that they’re evil — it’s that they don’t see.
And if we want to attack fraud, we have to understand that a lot of fraud is unintentional.
Tenbrunsel’s argument that we are often blind to the ethical dimensions of a situation might explain part of Toby’s story, his first unethical act. But a bigger puzzle remains: How did Toby’s fraud spread? How did a lie on a mortgage application balloon into a $7 million fraud?
According to Toby, in the weeks after his initial lie, he discovered more losses at his company — huge losses. Toby had already mortgaged his house. He didn’t have any more money, but he needed to save his business.
The easiest way for him to cover the mounting losses, he reasoned, was to get more loans. So Toby decided to do something that is much harder to understand than lying on a mortgage application: He took out a series of entirely false loans — loans on houses that didn’t exist.
Creating false loans is not an easy process. You have to manufacture borrowers, homes, and the paperwork to go with them out of thin air.
Toby was CEO of his company, but this was outside of his skill set. He needed help — people on his staff who knew how loan documents should look and how to fake them.
And so, one by one, Toby says, he pulled employees into a room.
“Maybe that was the most shocking thing,” Toby says. “Everyone said, ‘OK, we’re in trouble, we need to solve this. I’ll help you. You know, I’ll try to have that for you tomorrow.’ ”
According to Toby, no one said no.
Most of the people who helped Toby would not talk to us because they didn’t want to expose themselves to legal repercussions.
Of the four people at his company Toby told us about, we were able to speak about the fraud with only one — a woman on staff named Monique McDowell. She was involved in fabricating documents, and her description of what happened and how it happened completely conforms to Toby’s description.
If you accept what they’re saying as true, then that raises a troubling scenario, because we expect people to protest when they’re asked to do wrong. But Toby’s employees didn’t. What’s even more troubling is that according to Toby, it wasn’t just his employees: “I mean, we had to have assistance from other companies to pull this off,” he says.
To make it look like a real person closed on a real house, Toby needed a title company to sign off on the fake documents his staff had generated. And so after he got his staff onboard, Toby says he made some calls and basically made the same pitch he’d given his employees.
“It was, ‘Here is what happened. Here is the only way I know to fix it, and if you help me, great. If you won’t, I understand.’ Nobody said, ‘Maybe we’ll think about this. … Within a few minutes [it was], ‘Yes, I’ll help you.’ ”
So here we have people outside his company agreeing to do things that were completely illegal and wrong.
Again, we contacted several of the title companies. No one would speak to us, but it’s clear from the legal cases that title companies were involved. One title company president ended up in jail because of his dealings with Toby; another agreed to a legal resolution.
So how could it be that easy?
Typically when we hear about large frauds, we assume the perpetrators were driven by financial incentives. But psychologists and economists say financial incentives don't fully explain it. They're interested in another possible explanation: Human beings commit fraud because human beings like each other.
We like to help each other, especially people we identify with. And when we are helping people, we really don’t see what we are doing as unethical.
Lamar Pierce, an associate professor at Washington University in St. Louis, points to the case of emissions testers. Emissions testers are supposed to test whether or not your car is too polluting to stay on the road. If it is, they’re supposed to fail you. But in many cases, emissions testers lie.
"Somewhere between 20 percent and 50 percent of cars that should fail are passed — are illicitly passed," Pierce says.
Financial incentives can explain some of that cheating. But Pierce and psychologist Francesca Gino of Harvard Business School say that doesn’t fully capture it.
They collected hundreds of thousands of records and were actually able to track the patterns of individual inspectors, carefully monitoring those they approved and those they denied. And here is what they found:
If you pull up in a fancy car — say, a BMW or Ferrari — and your car is polluting the air, you are likely to fail. But pull up in a Honda Civic, and you have a much better chance of passing.
“We know from a lot of research that when we feel empathy towards others, we want to help them out,” says Gino.
Emissions testers — who make a modest salary — see a Civic and identify with its driver; they feel empathy.
Essentially, Gino and Pierce are arguing that these testers commit fraud not because they are greedy, but because they are nice.
“And most people don’t see the harm in this,” says Pierce. “That is the problem.”
Pierce argues that, cognitively, emissions testers can't appreciate the consequences of their fraud in the moment, because the cost is abstract: the global environment. They are, in effect, being asked to weigh the costs to the global environment against the benefits of passing someone who is right there and needs help. We are not cognitively designed to do that.
“I’ve never talked to a mortgage broker who thought, ‘When I help someone get into a loan by falsifying their income, I deeply consider whether or not I would destabilize the world economy,’ ” says Pierce. “You are helping someone who is real.”
Gino and Pierce argue that Toby’s staff was faced with the same kind of decision: future abstract consequences, or help out the very real person in front of them.
And so without focusing on the ethics of what they were doing, they helped out a person who was not focusing on the ethics, either. And together they perpetrated a $7 million fraud.
As for Toby, he says that maintaining the giant lie he’d created was exhausting day in and day out.
So in 2006, when two FBI agents showed up at his office, he quickly confessed everything. He says he was relieved.
Two years later, he was standing in front of the same judge who had sentenced his brother. A short time after that, he was in jail, grateful that his father wasn’t alive to see him, wondering how he ended up where he did.
“The last thing in the world that I wanted to do in my life would be to break that promise to my father,” he says. “It haunts me.”
Now if these psychologists and economists are right, if we are all capable of behaving profoundly unethically without realizing it, then our workplaces and regulations are poorly organized. They’re not designed to take into account the cognitively flawed human beings that we are. They don’t attempt to structure things around our weaknesses.
Some concrete proposals to do that are on the table. For example, we know that auditors develop relationships with clients after years of working together, and we know that those relationships can corrupt their audits without them even realizing it. So there is a proposal to force businesses to switch auditors every couple of years to address that problem.
Another suggestion: A sentence should be placed at the beginning of every business contract that explicitly says that lying on this contract is unethical and illegal, because that kind of statement would get people into the proper cognitive frame.
And there are other proposals, of course.
Or, we could just keep saying what we’ve always said — that right is right, and wrong is wrong, and people should know the difference.
Web story produced and edited by Maria Godoy; on-air story edited by Planet Money and Anne Gudenkauf.