Most of us probably wouldn’t knowingly spread false information on social media. Under certain circumstances, however, we may be willing to accept or even promote false statements, a recent study has found.

We are more likely to find misinformation acceptable if we believe it could eventually become true, even if it is false in the present. We are also likely to set aside ethical concerns about an unrealistic set of glowing credentials on a resume, a business's overblown description of its services or a politician's opinionated rhetoric if we believe such claims may one day become true, even though they clearly are not true now.

“Misinformation in part persists because some people believe it. But that’s only part of the story,” Beth Anne Helgason, one of the authors of a study about why false information often goes unchallenged, explained in a statement. “Misinformation also persists because sometimes people know it is false but are still willing to excuse it.”

Leaders in business and politics often use claims that clearly false outcomes could become true to justify statements that are verifiably false in the present, a kind of “hey, it could happen” approach. And because they refer to a possible but unverifiable future, these statements frequently go unquestioned and are shared on social media, the London Business School study found.

In a series of six experiments involving more than 3,600 participants, researchers presented students with statements clearly identified as false on a variety of topics. Some participants were asked to consider predictions about how the statements might become true in the future.

In one study, nearly 450 business school students were told to imagine that a friend lied on a resume, claiming a skill despite having no prior experience. Some participants were then asked to consider how the lie might become true, for instance, that the friend would soon enroll in a course to gain that skill. Students who imagined this possibility rated the friend's lie as less unethical.

In another study, nearly 600 American participants viewed six statements clearly labeled as false by fact checkers. Some of the statements were designed to appeal to liberals: “The average top CEO makes 500 times more than the average worker.” Others appealed to conservatives: “Millions of people voted illegally in the last presidential election.”

Next, participants were asked to generate their own predictions about how each statement might become true in the future. In one instance, they were told, “It’s a proven fact that the average top CEO currently makes 265 times more money than the average American worker.” Then they were asked to respond to the open-ended prompt, “The average top CEO will soon make 500 times more money than the average American worker if …”

Those on both sides of the political aisle were less likely to rate such baseless statements as unethical if they believed their broader meanings were true, especially when the false statement fit with their political views. The participants always knew the statements were false, yet imagining how they might become true made people find them more acceptable.

And participants continued to excuse the false statements and judge them acceptable even when they had been instructed to think carefully before evaluating them. As co-author Daniel Effron, a professor of organizational behavior at the London Business School, said, "Our findings are concerning, particularly given that we find that encouraging people to think carefully about the ethicality of statements was insufficient to reduce the effects of imagining a future where it might be true. This highlights the negative consequences of giving airtime to leaders in business and politics who spout falsehoods."

When people imagined that misinformation might become true, the researchers found, they were more likely to share that misinformation on social media, but only if it reflected their political views. "Our findings reveal how our capacity for imagination affects political disagreement and our willingness to excuse misinformation," Helgason, a doctoral student at the London Business School, explained. "Unlike claims about what is true, propositions about what might become true are impossible to fact-check. Thus, partisans who are certain that a lie will become true eventually may be difficult to convince otherwise."

The study is published in the Journal of Personality and Social Psychology.