Featured

Even Pretty Models Can Give Ugly Results

All models are wrong; some are useful.

George Box

More and more, leaders of every sort of enterprise – from corporations to federal, state and local governments – are using mathematical models to help guide them in decision-making. Clearly, the US and UK governments’ approaches to dealing with the Covid-19 pandemic were greatly influenced by the model developed by Neil Ferguson of Imperial College London and his co-workers. The calls for the Green New Deal stand (or fall) in part on the accuracy (or not) of the predictions of numerous global climate models. Many companies rely on weather models to guide important operating decisions. Most financial institutions (e.g., banks and especially the Federal Reserve) rely on models to develop strategies for dealing with the future.

Leaders are increasingly relying on models because they are a convenient way to harmonize the cacophony of data that assails all of us daily. But as Mae West once said, “A model’s just an imitation of the real thing.” (For those of you who don’t remember Mae West, think of Dolly Parton delivering Nikki Glaser’s innuendo.) Like a Monet landscape, a model accentuates certain facets of reality, ignores others and, sometimes, fills in blank spaces that can’t be seen. Thus, though models are produced by scientists, there is a certain art in crafting them – what to include, what to ignore, how to bridge regions where data may not be available.

The snare facing a decision maker in using the results of a mathematical model is that even the most elegant of models may mislead. The modeler, like Monet, has made choices about what data to include. If the model does not represent all of the data relevant to the decision to be made, then its usefulness is suspect. Decision makers need some sort of user’s guide to avoid that snare.

In my career, I have both developed and used models developed by others (usually successfully!). I have learned that the precision of a model’s results provides an illusion of certainty; i.e., the results may have three decimal places, but sometimes can only be relied upon within a factor of ten. Along the way, I’ve developed a few rules of thumb that have served me well in using the results of mathematical models. I generally use these in the form of questions I ask myself.

What was the model developed for? Even if the model was developed for my purpose, I have to satisfy myself that it is appropriate for the decision I have to make – e.g., what data were included and what were omitted. If the model was developed for a different purpose, I need to dig into what important facets of my situation may not be represented in the model.

Has the model been successfully used before for my purpose? In the case of the Imperial College infectious disease model, it was developed to look at deaths from SARS and other infectious diseases; thus, presumably it is suitable for use in the current pandemic. However, the model’s previous predictions of fatalities were off by orders of magnitude. Almost certainly, its predictions are upper bounds; however, they are so high that their usefulness is questionable.

Is my situation included within the bounds of the model? The Federal Reserve’s actions to respond to the pandemic are being driven, in part, by econometric models based on past history. Clearly, however, the usefulness of those models is open to debate – we’ve never been in this situation before – it’s like asking a blind man to paint a landscape. This can be very important when two or more models are coupled, e.g., modeling economic changes based on the results of a climate change model. If the climate change model’s results are based on an implausible scenario (RCP 8.5) then the results of the economic model are highly suspect.

What is the uncertainty associated with the model’s results? In some cases, the uncertainty is so large that the model’s results are not useful for decision-making. And if the modeler can’t tell me how certain/uncertain the model’s results are, that’s a huge “Caution” flag.
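This question can be made concrete. One common way a modeler can answer it is to propagate input uncertainty through the model by Monte Carlo sampling and report the spread of outputs rather than a single number. The sketch below uses an entirely hypothetical toy model and assumed input ranges, purely for illustration:

```python
import random
import statistics

def toy_model(growth_rate, population):
    """Hypothetical stand-in for a real model: 30 steps of compound growth."""
    return population * (1 + growth_rate) ** 30

def propagate_uncertainty(n_trials=10_000, seed=42):
    """Sample uncertain inputs and report the spread of the outputs."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_trials):
        # Illustrative, assumed uncertainty ranges for the inputs.
        growth_rate = rng.uniform(0.01, 0.05)  # between 1% and 5% per step
        population = rng.gauss(1_000, 100)     # mean 1000, std. dev. 100
        outputs.append(toy_model(growth_rate, population))
    return statistics.median(outputs), min(outputs), max(outputs)

median, lo, hi = propagate_uncertainty()
# The ratio hi / lo tells the decision maker how far the "answer"
# can move given plausible inputs -- here, a factor of several.
```

A modeler who can hand the decision maker a range like this, rather than a three-decimal-place point estimate, has answered the question; one who cannot has raised the “Caution” flag.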

How sensitive are the model’s results to variability in its inputs (e.g., initial conditions)? This is of crucial importance when considering large-scale mathematical models of complex phenomena (e.g., climate change). If the model’s results are very sensitive to its inputs, then the model’s inputs must be known very precisely. If the model developer has not performed a sensitivity analysis, another “Caution” flag goes up.
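The simplest form of such a sensitivity analysis perturbs one input at a time and watches the output. The sketch below (again using a hypothetical toy model, not any model mentioned above) bumps each input by 10% and reports the resulting percentage change in the output:

```python
def toy_model(growth_rate, population):
    """Hypothetical stand-in for a real model (30 steps of compound growth)."""
    return population * (1 + growth_rate) ** 30

def one_at_a_time_sensitivity(baseline, perturbation=0.10):
    """Bump each input up by 10% (holding the others fixed) and
    report the resulting percentage change in the model's output."""
    base_out = toy_model(**baseline)
    report = {}
    for name, value in baseline.items():
        bumped = dict(baseline)
        bumped[name] = value * (1 + perturbation)
        report[name] = (toy_model(**bumped) - base_out) / base_out * 100
    return report

sensitivities = one_at_a_time_sensitivity({"growth_rate": 0.03, "population": 1_000})
# If a 10% input change produces a much larger output change, that
# input must be known very precisely before the output can be trusted.
```

One-at-a-time analysis misses interactions between inputs, but even this crude version tells a decision maker which inputs deserve scrutiny.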

Has the model been validated in some way? This can be done in a variety of ways, but my order of preference is:

  1. Showing that model outputs are in reasonable accord with a real-world data set. “Reasonable” means that the agreement is good enough that I am convinced I can use the model’s results for my situation to make good decisions.
  2. Showing that each piece of the model is consistent with established principles. In some cases, there are no real-world data for comparison. If not, I want the modeler to be able to demonstrate that the algorithms in the model are consistent with accepted principles. This is fairly straightforward for physical phenomena unless the model assumes that they are coupled. It is much less so when one brings in social science constructs.
  3. Peer review (actually down about #22 on my list). Sometimes modeling results from peer-reviewed journal articles are offered as guides for decision-making. If the model has not been otherwise validated, I am wary of using its results. Peer review is not what it used to be (if it ever was!). I see it all too often becoming the last refuge of scoundrels – friends approving friends’ papers with limited review. The failed effort to replicate some of the most widely accepted results in psychological research (less than half could in fact be replicated), the David Baltimore scandal, and too many others lead me to accept peer review by itself as validation only if I have no other choice.
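For option 1 above – comparison against a real-world data set – the check can be as simple as an error metric with an explicit acceptance threshold. A minimal sketch, with made-up numbers and an illustrative 25% tolerance:

```python
import math

def validate_against_observations(predicted, observed, tolerance=0.25):
    """Compare model predictions with real-world observations.

    Returns the root-mean-square error, the mean relative error, and
    whether the mean relative error falls within an (illustrative)
    acceptance tolerance.
    """
    assert len(predicted) == len(observed)
    n = len(observed)
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
    rel_err = sum(abs(p - o) / o for p, o in zip(predicted, observed)) / n
    return rmse, rel_err, rel_err <= tolerance

# Made-up numbers standing in for model predictions vs. reported data.
predicted = [120, 150, 195, 240]
observed = [110, 160, 180, 260]
rmse, rel_err, acceptable = validate_against_observations(predicted, observed)
```

The important part is not the particular metric but that the modeler states the acceptance criterion up front – “reasonable accord” must be defined before the comparison, not after.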

Our leaders – at all levels – are increasingly relying on the results of a wide variety of models as decision-making aids. Often these are held up by experts as “the science” that must be followed. And yet, even the most elegant – the prettiest – of models may mislead. If a model’s results are accepted without question, the consequences for the community may be quite ugly. The wise leader trusts, but verifies by asking simple questions such as these.


Leadership

Leadership is a matter of intelligence, trustworthiness, humaneness, courage, and sternness.

Sun Tzu

This year has tested leaders at all levels in ways they never could have imagined.  A pandemic spawning an economic crisis, coupled with widespread social unrest.  One has to wonder if a plague of frogs is next!

Effective leadership is essential for community resilience.  While we all recognize what a leader should do, we often overlook what a leader should be – those attributes necessary for effective leadership.  The Art of War – the two-millennia-old classic Chinese treatise on war by Sun Tzu – has much to offer us as we try to understand what is needed for effective community leadership.

According to Sun Tzu, a successful leader must have the five traits listed above.  In the context of a community and its resilience, these traits might be better described as follows.

Intelligence.  Intelligence in leadership means that the leader knows how to clearly identify an objective, communicate it, plan to achieve it and then mobilize the resources needed to actually achieve the objective.  This implies that an intelligent community leader recognizes when the community must adapt to changing circumstances.  The intelligent leader is able to articulate that need and initiate the planning effort needed to effect change.  The efforts of city leaders in southeast Florida to adapt to rising seas are good examples.

Trustworthiness.  A trustworthy leader is recognized by the community as a person of integrity.  Thus, the community believes that the leader will carry out promised actions, and will provide support to the rest of the community to implement action plans.  Such a leader is thus able to communicate more effectively to the larger community, because even unpopular messages are more likely to be heard.  The public’s trust in Mayor LaToya Cantrell has played an important role in both limiting the coronavirus death toll in New Orleans, and in dampening the potential for violence.

Humaneness.  A humane leader cares about the community, and that caring is manifested in actions.  The community believes that the leader “feels their pain,” and therefore is more likely to follow where the leader is going.  This recognized innate humaneness of the leader is especially important when trying to reconcile different factions within the community.  Since mobilizing human and social capital is so important for action, humaneness is essential to effective community leadership.

Courage.  A leader must have the courage to persevere even when obstacles are encountered.  In essence, the courage needed by an effective leader is born of a certain innate confidence in one’s own integrity and intelligence – the leader believes the community is on the right course.

Sternness.  By “sternness,” Sun Tzu means a sort of rigorous fairness.  Rewards and punishments are strictly based on actions, not the person acting.  Ultimately, this sternness is the result of a sort of self-discipline in which the leader may have favorites but does not favor them. It inherently results in leadership that holds itself responsible, and does not fear to hold others accountable for their actions.

Many of the commentators on The Art of War have stressed the danger of valuing one of these above the others. For example, excessive humaneness (think empathy) can lead to either weakness or paralysis; courage to foolhardiness. Excessive sternness can lead to cruelty; intelligence to arrogance. Leaders thus should strive for an Aristotelian balance of these attributes.

The transformation of Charlotte, NC, from a textiles to a financial center illustrates the importance of several of these leadership traits.  Up until the 1970s, Charlotte had been one of the leading centers for the textile industry in the country.  The heads of two of the largest banks in North Carolina and the head of Duke Power recognized that the demise of that industry threatened Charlotte’s vitality.  All three were embedded in the community, and had earned its trust. All three passionately cared about Charlotte’s future, and their caring about the city’s future was widely recognized by the public.  Acting largely independently of city and county governments, these three formed an organization aimed at helping Charlotte adapt to these changing conditions.  As plans were developed, these three spearheaded the transformational effort.  They helped rebuild some of the poorest sections of the city (encountering opposition because many of these were predominantly black), courageously turning what had been almost slums into desirable neighborhoods.  In spite of criticism and carping, these three eventually transformed Charlotte into what has become the second largest financial center in the country.

Many of our communities and our country are embroiled in painful and often rancorous debates about racism, inequality and our future.  Effective leadership is essential if we are to emerge from the acrimony and build the better future we all want.  Sun Tzu’s wisdom can point us toward those leaders likely to be effective. Leaders who have the intelligence to see the problems and to recognize real solutions. Leaders with the recognized trustworthiness and passion to move the community forward. Leaders who care enough and are courageous enough to enlist the entire community; yet disciplined enough to hold themselves and everyone else accountable.


A User’s Guide to Expert Advice

All your knowledge is about the past and all your decisions are about the future.

Ian Wilson

The Mayor of my small city owns a short string of dry cleaners. He sort of galumphs around town like a latter-day Bullwinkle. He’s a small-town avatar of Alfred E. Neuman.

His job – like that of all leaders – is to mobilize resources and get the right things done. But in our complex world, it’s often difficult for a leader to know what the right things are. Decisions must be made in realms in which community leaders have no experience or expertise. Thus, they must rely on the advice of experts for guidance.

For a decade or so, I was the recognized technical expert in a field fraught with technical challenges and political minefields, where decisions sometimes involved hundreds of millions of dollars. I later led a multi-million dollar enterprise, where I had to rely on the expertise of others. Having both provided expert advice and used others’ expertise to make important decisions, let me share a few lessons I’ve learned.

Experts advise, leaders decide. This is the most important lesson I’ve learned! Simple, isn’t it – but packed with meaning. First and foremost, a leader needs to define victory – what the desired outcome is. We all want to make data-informed decisions, but that means that the leader needs to lay out the context for the decision. If not, the expert’s advice may be not only misdirected but may also lead to unintended consequences. In my experience, describing one or more desirable end-states and asking how to achieve them gives better results than effectively limiting the expert’s response.

Let me use a climate change example. South Florida has seen rising sea levels, and increasing numbers of King Tides and flooding of low-lying areas. Area leaders are united in wanting to limit the impacts of flooding on their communities. One leader might go to his experts and ask how to prevent roads from flooding. Experts might answer that roadways should be elevated. And – voila! – it works, except that now the flooding is in residents’ yards and houses. A better approach might be to ask how to limit flooding so that it disrupted neither transportation nor people’s lives, leading to changes in land use and better watershed management.

Another example. The US approach to the pandemic has been to “flatten the curve,” i.e., victory was defined as no Covid-19 deaths due to lack of appropriate medical care. The Swedish approach has focused on protecting the most vulnerable while not intruding too much on daily life. The rising tide of deaths of despair (not to mention medical procedures delayed too long), the economic upheaval and the social unrest we in the US are already experiencing may indicate that Swedish leaders were better at defining victory. No matter which approach ends up with a better end-state, this highlights the importance of carefully defining victory.

One other point to remember. Decisions, particularly public policy decisions must transcend domain expertise by considering other factors. It’s not enough to follow the science. Legal, economic (resulting unemployment) and social impacts (e.g., potential domestic violence, increased deaths of despair) and squishy things like values and the public’s expectations must also be considered. Thus, for decisions with broad societal ramifications, a leader needs advice from a number of different disciplines. The leader has to balance their different perspectives and try to craft a decision that comes as close as possible to the desired outcome. Ultimately, that’s why it’s lonely at the top – leaders often have to implement decisions in an uncertain environment.

Experts are human – usually. Man is a pattern-seeking animal. One of the keys to our survival as a species is that we see patterns (for example, of potential danger) and act on them (e.g., run like hell in the opposite direction). An expert’s advice is most often based on the pattern the expert perceives. Experts’ expertise is thus simply the sum of their experiences, i.e., what they have seen and what they have learned. Sometimes we take that to mean that an expert has to be some hoary old fart who’s been around for years. While it does take time to develop expertise, it’s really more a matter of how much has been learned rather than the time spent learning it (i.e., it’s better to have learned from thirty different experiences, than to have experienced the same thing thirty times).

In my case, I was relatively young (and in a young field) but had seen – and caused! – a lot of failures and had worked hard to learn their causes and prevention. In my case, that bred a great deal of humility – I recognized that I probably knew more than most others, but also recognized how little I knew compared to all that there was to know. Leaders should beware of experts who think that they know it all. They’re likely to introduce cognitive biases into their advice (e.g., cherrypicking data; ignoring facts that don’t fit their preconceptions).

Just as dangerous is the expert who recognizes the uncertainties within a given situation (i.e., can’t find a pattern) but defaults to some other basis for advising leaders (e.g., an unvalidated computer model; use and abuse of models will be the subject of a later post). Too often, these situations result in a sort of “Groupthink” – where experts cluster around a single concept that they might individually not support so strongly.

I’ve learned one other useful lesson: experts, like the rest of us, have biases that may not match those of the decision-maker. I’m sure many of you have been in the situation where a consultant has recommended a course of action that would benefit him, but might or might not achieve victory. The expert may recommend a very conservative course of action so that there is little danger of the expert being proven wrong in his field – the expert can claim success whether or not he’s pointed to the best path. If the expert is part of an entrenched bureaucracy, she may tilt her advice so that it benefits her organization.

Hedgehogs and foxes. Philip Tetlock popularized a concept that dates back to the Greek poet-warrior Archilochus – there are those that know lots of little things (foxes), and those that know one big thing (hedgehogs). This applies to experts as well. Each type has its strengths and weaknesses. Foxes are more likely to foresee potential unintended consequences of a proposed action than hedgehogs, and to be collaborative. However, hedgehogs’ advice may reflect their deeper understanding within a given situation, and thus be superior for bounded problems. Conversely, hedgehogs often are guilty of “epistemic trespassing,” believing that their expertise in one discipline makes their opinion of great value in another.

In my experience, experts from a specialized organization tend to be hedgehogs; broader organizations provide the broader range of experiences needed for foxes. The decision-maker needs to remember that while life is not stovepiped, bureaucracies are – the best advice comes from a competitive intellectual market, involving both foxes and hedgehogs. In the words of Dr. Li Wenliang, who tried to warn the Chinese government of the dangers of Covid-19, “There should be more than one voice in a healthy society,” i.e., an effective decision should have broad input.

To use – or not to use. Let me briefly close with an echo of something I touched on above. The decision-maker is responsible – and accountable – for decisions made, not the expert. Saying that “I’m just following the Science” is a copout and an abdication of responsibility. In my career, I’ve tried to determine whether to follow expert advice based on three factors:

• My trust in the expert or group of experts. This entails factors such as their inherent biases, their track record, their confidence in their conclusions and their consideration of potential unintended consequences.
• The inherent quality of the advice. This entails factors such as how well the current situation seems to match the experts’ assumptions and the experts’ appropriate consideration of uncertainties.
• The fit to the decision. The experts may give me great advice but it’s up to me to determine whether it actually will lead to victory as I’ve defined it.

All leaders eventually are faced with decisions which transcend their own experience. The increasing complexity of our communities, and the unprecedented challenges they face, require that the leaders of our communities receive expert advice. But those leaders must recognize that they are the ones who will make the decisions, and who will be held accountable for their results. I hope that these extracts from my own experience will help leaders better utilize experts and their expertise, and to make better decisions based on expert advice.

========

A brief postscript. This week, we surpassed 100,000 deaths from the coronavirus in the US. On Memorial Day, we also honored those whose lives were sacrificed in service to their country. We can best honor those whose lives were lost from the virus by learning – and acting on – the lessons their loss can teach us. We need to do this with eyes not shaded by party or prejudice, and with a clear intent to not walk down this path again.