Book Review: The Art Of Thinking Clearly

April 29, 2025


As humans, we are all prone to making decisions based on unreasonable or unfair judgments (biases) and to acting on mistaken beliefs rooted in unsound arguments (fallacies).

Such cognitive errors are not always “toxic, … some are even necessary for leading a good life,” said Rolf Dobelli, a philosophy and business administration scholar, in his 2013 book “The Art of Thinking Clearly.” And for better or worse, “they are far too ingrained for us to rid ourselves of them completely … Silencing them would require superhuman willpower.”

“The Art of Thinking Clearly” highlights cognitive errors that influence business leaders’ decisions. Realizing their existence is essential, as they often are “routine mistakes, barriers to logic we stumble over time and again,” Dobelli wrote.

Perception, perception

“Confirmation bias is the mother of all misconceptions,” says the book. In a nutshell, it is a person’s inclination to align new information with existing theories, beliefs, and convictions. “We filter out any new information that contradicts our existing views.”

That bias leads CEOs to dismiss failures as the result of external events unrelated to the business, even though such failures “often hide the presence of discomforting evidence” of weakness in their companies.

“Confirmation bias” also impacts an individual’s perception of the world. “[People] browse news sites and blogs, forgetting that our favored pages mirror our existing values,” the book says. “Whether you [believe] that people are inherently good or inherently bad, you will find daily proof to support your case.” 

Another perception error is “availability bias,” where individuals use available information to “systematically overestimate risks like being the victim of a plane crash [while] underestimating the risk of dying from … diabetes” because the former gets more media attention.

Meanwhile, the “salience effect … ensures that outstanding features [of an event] receive much more attention than they deserve.” That “influences not only how we interpret the past, but also how we imagine the future.” 

The “Forer effect” manipulates perception through vague, flattering statements, such as “You have amazing intellect” or “You are an amazing independent thinker,” which the average person can’t argue against, giving them unwarranted confidence.

Another fallacy is the “illusion of attention,” where “we are confident that we notice everything that takes place in front of us. But in reality, we often see only what we are focusing on,” the book says. This fallacy ties in with “news illusion,” where the book argues that an average person spends too much time reading the news, resulting in “immense loss of productivity,” as almost nothing they read affects their daily life.

Those two illusions lead to “introspection illusion,” where “we are so confident in our beliefs [that] when someone fails to share our views,” we assume “ignorance, … idiocy, [or] malice.” That could ultimately lead to “cognitive dissonance,” in which people lie to themselves that their decisions were correct.

Attention, news, and introspection illusions manifest themselves when a person falls for “chauffeur knowledge” and “forecast illusion.” Those fallacies hold that people who casually accumulate knowledge are unlikely to make accurate forecasts. Chauffeur knowledge accumulators “reel off eloquent words as if reading from a script,” the book says, acting as if the expert knowledge they “espouse is … their own.”

Success and failure

Another major category of biases and fallacies relates to a person’s assessment of their chances of success. “Survivorship bias,” “base-rate neglect,” and “neglect of probability” arise when individuals focus only on success stories rather than the number of companies that failed versus the number that succeeded. 

Those errors happen “because triumph is made more visible than failure,” making individuals “systematically overestimate [their] chances of succeeding. You succumb to an illusion, [overlooking] how minuscule the probability of success really is.”

Alternatively, the “intention-to-treat” fallacy happens when “failed projects … show up prominently” in the wrong category. For example, banks claim that indebted companies are more profitable; in reality, banks quickly sell the assets of defaulting firms, removing them from the “failed projects” category.

When success does come, entrepreneurs may fall victim to “inductive thinking” and “illusion of skill,” believing their superior abilities have allowed them to succeed. Dobelli argues that “luck plays a bigger role than skill.” The book says luck is most evident in the “beginner’s luck” fallacy, which encourages “amateurs [to] increase the stakes.”    

Prolonged spells of success can lead to believing “false truths,” such as when banks believed, before the 2008 global financial crisis, that real estate prices could never fall.

Ultimately, CEOs fall for “self-serving bias,” taking personal responsibility for the company’s success, but blaming external factors for failure. Reasons include “the unfortunate exchange rate, governmental interference, malicious trade practices … various hidden tariffs, subdued consumer confidence, and so on.” 

Chosen one

Another set of biases and fallacies claims startups can mimic the success of superstar companies. The book claims hard work alone may not be enough, stressing that some individuals were born with the resources to succeed. “Whenever we confuse selection factors with results, we fall prey to … swimmer’s body illusion.” 

Meanwhile, customers may fall for the “halo effect,” where “we take a simple-to-obtain or remarkable fact or detail … and extrapolate conclusions.” One example is “when we favor products from a manufacturer simply because of its good reputation.” Advertising has found “an ally in the halo effect” when it associates a product with happiness, for example.

“Fundamental attribution” error comes from “the tendency to overestimate individuals’ influence and underestimate external factors.”

Another error in judgment is “self-selection bias,” where a person forgets they are part of the sample they observe, such as when complaining about persistent bad luck. It also is evident when marketers assess a product’s quality by sending surveys to existing users who “are clearly satisfied, have time to respond, and have not canceled their subscriptions.”

Executives and CEOs also fall into “alternative blindness,” failing to consider or even see all available options.

Meanwhile, “clustering illusion” and “coincidence” plague stock investors, in particular, as they perceive correlations between investment options that don’t exist. For example, an investor believes “if share prices and oil prices climb or fall in unison, gold will rise the day after tomorrow.”

Lastly, there is the “illusion of control,” where individuals “believe that they can influence something over which they have absolutely no sway.”

The problem with teams

“The Art of Thinking Clearly” argues that biases and fallacies can thwart the work of teams. “Social proof,” “groupthink,” and “authority bias” all revolve around the pressure a single team member faces when disagreeing with most team members or a charismatic leader.

Team members also fall victim to “in-group, out-group bias.” According to Dobelli, stereotypes and prejudices against non-team members stem from the “out-group bias.” Meanwhile, in-group members “receive a disproportionate amount of support … This distortion is dangerous, especially in business,” Dobelli says, leading to “organizational blindness.”

Another issue with having teams working on the same tasks is “social loafing,” where individual members minimize their efforts while relying on other members to pick up the slack. “Social loafing is a form of cheating,” the book said. “Input doesn’t grind to a complete halt [as] zero performance would be noticed, and it brings with it weighty punishments, such as exclusion from the group or vilification.”

A step too far

“Sunk cost fallacy,” “it’ll-get-worse-before-it-gets-better fallacy,” “effort justification,” and “outcome bias” surface when a corporation fails to abandon a project if results are consistently “negative or expected to get worse.”

In such situations, a decision-maker argues, “We’ve invested so much money [and time] in it,” Dobelli said. “If we stop now, it’ll all be for nothing.” To justify their persistence and investments, CEOs and investors tend to “overvalue the results.” Additionally, those decision-makers “tend to evaluate decisions based on the results rather than the decision process.”

Once CEOs or investors make a decision, they must abandon other options (the “inability to close doors” fallacy). This is most evident when “companies that aim to address all customer segments end up addressing none.”

Also related is the “endowment effect,” where “we consider things to be more valuable the moment we own them,” the book says. “If we are selling something, we charge more for it than what we … would be willing to spend.”

Communications fallacies

Another set of cognitive errors comes from interacting and transacting with business associates and customers. The book points to the dangers and benefits of “reciprocity.”

On the upside, reciprocity “is a very useful survival strategy, a form of risk management,” as it guarantees “cooperation between people who are not related to each other, and is a necessary ingredient for economic growth and wealth creation.”

The downside is that reciprocity can instigate “full-scale wars” involving trade, for example. On a corporate level, it pressures gift recipients to agree to the other side’s requests.

Companies “frame” communication by focusing on absolute figures when the news is good and percentages if the news is negative, the book says. Another framing illusion is to emphasize the positives, such as saying meat is 99% fat-free rather than saying it has 1% fat, Dobelli says. 

That has opened a new world of lingo. “A tumbling share price becomes a correction … an overpaid acquisition price is branded goodwill … a problem magically transforms into an opportunity or a challenge, [and] a person fired is reassessing his career.”

“Liking bias” is another error in judgment in which a transaction is executed based on personal feelings, not technical competencies.

The “incentive super-response tendency” occurs when an executive uses the wrong metric to reward employees. The most vivid example is paying a service provider by the hour: providers then have a vested interest in stretching the work for as long as possible to earn higher pay.

In conclusion, Dobelli stresses, “Nature doesn’t seem to mind if our decisions are perfect or not, as long as we can maneuver ourselves through life — and as long as we are ready to be rational [at the right time]” and can identify and develop a “circle of competencies.”

This story first appeared in April’s print edition of Business Monthly.