
Archive for the ‘Decision Making’ Category

A favorite Dilbert cartoon from a few years back has one character at a restaurant smugly insisting to his dining partner that he would never be so stupid as to provide his credit card details online.  Meanwhile he is paying the bill by handing his credit card to a waiter who disappears with it, supposedly only processing the dinner payment.

The cartoon illustrates how difficult it is to be consistently rational – that is, rational whenever we should be, and rational in the same way in similar situations.  Even highly rational people have blind spots: matters that even they would agree call for rational assessment, but on which they fail to exercise their rational faculties, without even realising that they are not doing so.

“Highly rational people” includes faculty members on admission committees of prestigious medical schools.

A “Perspective” piece in a recent issue of the top-shelf medical journal The Lancet, by Donald Barr of Stanford, describes how admission committees usually place a heavy emphasis on strong performance in science subjects, supposedly because those who are good at science will become good doctors and vice versa.   Barr decided to examine the evidence for this presumed correlation.   What he found, roughly, is that the evidence pointed the other way: the better you perform in undergraduate sciences, the worse you are as a doctor.  (Of course you should read Barr’s article for a more detailed and nuanced summary of the evidence.)

In other words, faculty members on admission committees of medical schools – the kind of people who would think of themselves as highly rational, who would readily stress the importance of taking proper account of the scientific evidence in medical practice – these faculty members were basing their admission decisions on a belief that was unfounded, erroneous, and harmful to their profession!

Barr’s scathing commentary is worth quoting at length:

If what we seek from our students is professional excellence, we must be careful to base the manner in which we select these students on scientific evidence, not on superstition. Beyond its religious connotation, The Oxford English Dictionary suggests that superstition represents an, “irrational or unfounded belief”.  The belief that one’s knowledge of science represents a continuous metric of the quality of one’s preparation for the study of medicine represents precisely such an “unfounded belief”. There seems to be no scientific evidence to support it. Great physicians base their professional practice on a threshold of scientific knowledge they have acquired throughout their career. Upon this foundation they build an artistic display of communication, compassion, empathy, and judgment. In selecting students for the study of medicine, we must be careful to avoid superstition, and to adhere to the evidence that equates as metrics of quality a preparation in fundamental scientific principles and the non-cognitive characteristics that are conducive to professional greatness.

Note that these medical faculty members’ admissions decisions are in an obvious sense “evidence-based.”  Each applicant would have provided a dossier of information (grade transcripts, letters of recommendation, etc.) and the learned professors would have been taking careful note of that evidence at least.

However, their method of making decisions was not adequately evidence-based. In adopting their method they had implicitly made judgements not about students themselves but about the criteria for selecting students; and those judgements were not properly based on evidence.

It might be helpful to distinguish first-order from second-order evidence-based decisions.  A first-order evidence-based decision is one which properly takes into account the evidence relevant to the decision at hand.  This may include the particular facts and circumstances of the case, as well as more general scientific information.  So for example in a clinical context, a doctor making a first-order evidence-based judgement as to treatment would consider the available information about the patient as well as scientific information about the effectiveness of the treatment being recommended.

Now, taking evidence into account properly implies the existence of some method for finding and evaluating evidence and incorporating it into the final judgement.

A decision is second-order evidence-based when the choice of method is itself properly evidence-based.  Somebody making a decision which is second-order evidence-based is considering (or has duly considered) not only the evidence pertaining to the decision at hand, but the evidence pertaining to the choice of method for making that type of decision.

[In theory of course there would be third-order evidence-based decision (evidence-based decisions about how to decide what method to use in making a decision), and so on.]

Donald Barr can be seen as urging that the medical profession be (or be more) second-order evidence-based in their admission decisions.  His recommendation is not that they take scientific data into account in any particular decision.  It is, rather, that they allow scientific evidence to shape their general framework for making admission decisions.

The distinction between first- and second-order evidence-based decisions is somewhat subtle.  In my experience it can be difficult for people to understand the distinction and appreciate the importance of being second-order evidence-based.

One experience of this, as it happens, was also with medical types.

I was contacted by a doctor – call him Dr. Smith – who was on a state committee whose job it was to spend millions of taxpayer dollars on fancy new medical equipment.  The committee received many applications and it had to decide which of those most deserved funding.  As he described it, the committee was using a relatively lightweight, informal version of standard multi-criteria decision making (list your criteria, weight your criteria, rate each option on each criterion, etc.).  Why were they using this method?  Apparently because “it seemed like the right thing to do” and “that’s the way we do it”.
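For concreteness, the kind of lightweight multi-criteria scoring the committee was apparently using can be sketched in a few lines of Python. The criteria names, weights, and ratings below are purely hypothetical illustrations, invented for this sketch; they are not drawn from the actual committee.

```python
# A sketch of weighted multi-criteria scoring: list your criteria,
# weight them, rate each option on each criterion, then rank the
# options by weighted sum. All names and numbers are invented.

criteria = {  # weight of each criterion (summing to 1)
    "clinical_benefit": 0.5,
    "cost_effectiveness": 0.3,
    "equity_of_access": 0.2,
}

applications = {  # rating of each option on each criterion (0-10)
    "MRI upgrade":   {"clinical_benefit": 8, "cost_effectiveness": 6, "equity_of_access": 4},
    "Dialysis unit": {"clinical_benefit": 7, "cost_effectiveness": 5, "equity_of_access": 9},
}

def weighted_score(ratings, weights):
    """Weighted sum: each rating multiplied by its criterion's weight."""
    return sum(weights[c] * ratings[c] for c in weights)

# Rank applications from highest to lowest weighted score.
ranking = sorted(applications,
                 key=lambda a: weighted_score(applications[a], criteria),
                 reverse=True)
```

Whether a simple weighted sum like this is the right method for a given allocation decision is, of course, exactly the second-order question at issue here.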

Dr. Smith was concerned that the way in which the committee was making these decisions was insufficiently rigorous and transparent.  In particular he was worried that the process was not reliable enough, in the sense of treating similar cases the same way.  There was too much room for the slip and slop of unanchored subjective judgments.  This despite the fact that their decisions took into account a large amount of information and even scientific evidence – such as data about the benefits of certain types of treatments and technologies.

However Dr. Smith was a lone voice.  He was trying to persuade the chair of the committee, and other committee members, and the secretariat supporting the committee, that they should spend at least some time thinking not just about who should get the money, but about how they decide who should get the money.

I was invited to speak to the chair and the secretariat.  In a web-conference, I explained that there were many possible ways to make decisions of the kind they had to make.  Indeed there is an academic niche concerned with this very issue, i.e., how to make allocation decisions.  These methods have been subjected to some degree of scientific study, and it is clear that some are better than others for particular types of decisions.  I explained that if they were not using the best method, they would likely be mis-allocating resources.  (Which means, in this kind of case, not just wasting money but not saving as many people from sickness and death as they might have.)

Then came the really tricky part.  I tried to gently explain that evidence-based selection of a decision method requires expertise, and not the expertise that they have as doctors, professors of medicine, or even medical administrators.  Rather, it is the expertise of a decision theorist.  Simply put, just being smart and having a fancy medical appointment doesn’t make you qualified to decide how to decide who should get funded.

As you can imagine, the conversation was polite but it went nowhere.  I was trying to explain to senior medical professionals that their decisions could be more rigorously evidence-based.  They prided themselves on making evidence-based decisions, and they were making evidence-based decisions, but these were only first-order evidence-based.  I was trying to convince them to be second-order evidence-based (though I didn’t use those words).  Further, I was suggesting that their expertise wasn’t the sort of expertise that was really needed for their decision to be properly second-order evidence-based.

I had the feeling that they could hear the words but didn’t really “get” what I was saying.  Last I heard, nothing had changed in the way they were making their decisions.



In his recent post “To accept or to decline: mapping life’s little dilemmas using IBIS”, Kailash Awati provides a nice case study of using mapping to make a significant personal decision.   Interestingly, the “little dilemma” in the case study is just the same kind of issue that was facing Joseph Priestley when he wrote to Benjamin Franklin asking for his advice, resulting in Franklin’s famous letter describing his “moral algebra”.   Like most people, when reading Franklin’s letter I didn’t bother to ask what exactly was bothering Priestley so much that he would beg for Franklin’s advice.  As described in Steven Johnson’s excellent book The Invention of Air, Priestley was deliberating over whether to accept a particular job offer.  And as I discuss here, Franklin’s two-column pro/con method, for all its virtues, is just too simple to accommodate the true complexity of deliberative decision making.  Awati’s case study illustrates how the moral algebra can be extended to embrace this complexity while retaining clarity by using the IBIS methodology and supporting software such as Compendium or bCisive.


A version of my Quadrant essay “The Wise Delinquency of Decision Makers” was recently broadcast on ABC Radio National’s Ockham’s Razor program.  Audio and transcript available here.

Robyn Williams, the great science journalist  and host of Ockham’s Razor, introduced it thusly:

I remember a few years ago being on a committee choosing some prize winners, for innovation, or some such recognition. There was a new system in front of us, with grids, numbers out of ten in about twelve categories, cross-over elimination processes, all in all the same kind of chart you imagine The Pentagon using to invade Iraq in 2003.  Now I happened to know who should have won the main prize. It was obvious; Malcolm Gladwell wrote all about this in Blink, you just know. But the process didn’t allow any of that. We ground through the chart system methodically. Hours passed.  My winner wasn’t even in the shortlist. A list of the usual suspects was signed off. My guy later went on to win three national awards.  I quietly resigned from the committee.  So much for the gossip about decision making. Now the theory.

Robyn’s anecdote, and his instinctive antipathy to the officially mandated, matrix-based method, are certainly consonant with the themes of Wise Delinquency.  It is just a pity about the passing reference to Blink, a book whose beguiling readability concealed the simplistic and misleading nature of its main message.   My point was not the Gladwellian idea that rigorous and painstaking methods should give way to gut feelings or intuitive “blinks”.   Rather, it was that certain kinds of rigorous and painstaking methods should not be applied by force in situations where qualitative deliberation is more appropriate.  And there are lots of such situations.

A recent article in Slate describes how one of the flagship examples in Gladwell’s book, the supposed predictive ability of marriage researcher John Gottman, only seems impressive because of egregiously lousy statistical methods.   These problems had been made public well before Blink was published.  I’m told that Gladwell had been informed of these problems, but apparently chose not to mention them; the truth of the matter would have interfered with a good story.   That seems like delinquency, but not of the wise kind.


The central responsibility, for Boards and for individual Directors, is to make good decisions.  What can Directors do to improve their decision making ability?

First, it is important to understand that decision making is a complex cognitive skill.  It is not an innate talent that some people were granted at birth.  Nor is it something that simply builds up with lots of experience – though experience is certainly relevant.

Rather, decision making is a cultivated skill, or in other words a domain of acquired expertise.   So the question becomes – how can Directors acquire more of this expertise, particularly with regard to the specific kinds of decision challenges that come before Boards?

We can look to contemporary cognitive science for some insights here.  A sub-field of cognitive science addresses the problem of expertise.   Scientists in this area are basically asking “How do people get really good at something?”

The dominant answer that has emerged over the past few decades is not really all that surprising.  To get really good, you need to do lots of good practice.   To be the best, you need to do the most, best quality practice.  The main reason Tiger Woods has been the top golfer is that he practised more, and more effectively, than anyone else.

For Directors, this means that to enhance your decision expertise, you need lots of practice making Board-type decisions.   Note that it is quality practice you need, not necessarily experience.   A Sunday golfer can accumulate lots of experience over the years, but their game never really improves because they are not working “on” their game.   In fact, lots of experience can make it harder to improve skills, because it can entrench bad habits.

This is where Julie Garland Mclellan’s “Director’s Dilemma” newsletter, and her new book Dilemmas Dilemmas, are so valuable.   They present lots of realistic case studies of decision problems of the kind that Directors regularly confront.  They give lots of opportunities for Directors to work on their game.

However, simply working through the case studies – and reading the “model answers” provided – may not be enough.   What really matters is how you practice.  And here the key insight is the obvious point that if you’re not in some way changing the way you do things, then your expertise will not improve.   To really get the benefit of working through case studies, you need to be expanding and refining your skill set.

Directors’ decision challenges, as illustrated in Dilemmas Dilemmas, are typically complex deliberative problems.  That is, they involve clarifying the problem, recognising a range of options and perhaps suboptions, understanding the advantages and disadvantages of those options, and weighing them up.   The core skill is disentangling and evaluating a complex set of issues and arguments.  How can a Director come to do this better?

One approach is to exploit the visual.  We know, in general, that when confronted with complexity, the human mind performs better with suitable visualisations.  A simple example: to understand how all the streets, train lines, etc., in a modern city are laid out, we use diagrams such as the maps found in a road atlas or, these days, the displays on GPS devices.

What is true for roads is even more true for deliberative decisions, which are abstract and indefinitely complex.   The human mind can cope more easily when the options, pros and cons, arguments and detailed evidence are laid out in a visually attractive, easily scannable form.   And the process of laying out the problem in this form can help introduce clarity and rigour.   The overall result is better understanding of the problem, better evaluation of the options, and, on average, better choices being made.

This approach to decision making has been pioneered by Austhink, who have developed a software tool (bCisive) to support the process.

Austhink has teamed up with Julie Garland Mclellan to offer workshops combining her Board expertise and case studies with Austhink’s methods and tools.  The workshops are intended primarily for “emerging” Directors – those who have recently become Directors, or hope to become Directors, or hope to take their directorial activity to a higher level.  Our ambition is that these workshops would be one of the most effective things that Directors – at any level of experience or expertise – could do to enhance their decision making expertise.   And if individual Directors can improve their game, Board decision making in general should also improve.

More information about the workshops.


I’m currently working on a book on decision mapping (and more generally, deliberative decision making), tentatively called Draw the Right Conclusion!.  I’ll be periodically releasing draft chapters.   First cab off the rank is the Introduction.

Comments and suggestions most welcome.

Here are the opening paragraphs:

In late 1772 Joseph Priestley was wrestling with a mundane problem.

Over a period of just a few years, Priestley had transformed himself from a little-known minister and teacher in towns of northern England into one of the most important scientists of his day.  He had published the History and Present State of Electricity, the first compendium of scientific knowledge in this new field, and the dominant textbook for the next hundred years.  His recent investigations had revealed one of the most profound aspects of life on earth: that plants make the air fit for us to breathe.  This work earned him the Royal Society’s Copley Medal, the Nobel Prize of his day.  Soon he was to isolate the substance that plants were providing, thereby playing a crucial role in the discovery of oxygen.

Yet the dilemma causing him so much anxiety was of a kind any of us might recognize.  Should he move with his young family from Leeds to Wiltshire?  He had been offered a kind of patronage by William Petty, the controversial Earl of Shelburne.  Shelburne would house the Priestleys at his estate, Bowood, and provide Joseph with a laboratory and time for research.  In return Joseph would be required to act as tutor to Shelburne’s sons and advisor to Shelburne himself.  Priestley had to resolve a personal conundrum laced with unknowns and incommensurabilities.  Would he be sufficiently free to pursue his intellectual passions?  Would his experimentation continue to bear fruit in the new environment?  Did he owe it to his family to accept the greater comfort and security attached to the new position?

Eventually Priestley turned for help to his friend and scientific colleague, the great American scientist and statesman Benjamin Franklin.  Franklin replied in a letter which has become a classic in the theory of decision making:

My way is to divide half a sheet of paper by a line into two columns; writing over the one Pro, and over the other Con. Then, during three or four days consideration, I put down under the different heads short hints of the different motives, that at different times occur to me, for or against the measure. When I have thus got them all together in one view, I endeavor to estimate their respective weights; and where I find two, one on each side, that seem equal, I strike them both out. If I find a reason pro equal to some two reasons con, I strike out the three. If I judge some two reasons con, equal to three reasons pro, I strike out the five; and thus proceeding I find at length where the balance lies; and if, after a day or two of further consideration, nothing new that is of importance occurs on either side, I come to a determination accordingly. And, though the weight of the reasons cannot be taken with the precision of algebraic quantities, yet when each is thus considered, separately and comparatively, and the whole lies before me, I think I can judge better, and am less liable to make a rash step, and in fact I have found great advantage from this kind of equation…

Franklin called this method a “moral algebra”: a kind of calculation, but one suited to human affairs, where often the stakes are large, the alternatives many, the considerations diverse and uncertain, and where your choice will be a test and reflection of your character.

Such as whether to get married, and in particular whether to marry your cousin…
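Franklin’s cancellation procedure lends itself to a simple sketch. Assuming each reason has already been assigned a rough numeric weight, the method strikes out opposing reasons of equal weight and then sees where the balance lies. The reasons and weights below are hypothetical, loosely inspired by Priestley’s dilemma, and the sketch simplifies Franklin’s method in one respect noted in the comments.

```python
def moral_algebra(pros, cons):
    """A sketch of Franklin's 'moral algebra': cancel a pro against a
    con of equal estimated weight, then see which side retains
    uncancelled weight. (Franklin also cancelled groups, e.g. one pro
    against two cons of equal combined weight; omitted here for brevity.)"""
    pros, cons = dict(pros), dict(cons)
    for p, wp in list(pros.items()):
        for c, wc in list(cons.items()):
            if c in cons and wp == wc:  # equal weights: strike both out
                del pros[p], cons[c]
                break
    # Positive balance favours the pros, negative favours the cons.
    return pros, cons, sum(pros.values()) - sum(cons.values())

# Hypothetical weights for Priestley's decision to join Shelburne:
pros = [("time and laboratory for research", 3), ("financial security", 2)]
cons = [("uprooting the family", 3), ("dependence on a patron", 1)]
remaining_pros, remaining_cons, balance = moral_algebra(pros, cons)
```

Even this toy version makes Franklin’s point visible: after cancellation, what remains is a smaller, more tractable set of considerations, and the residual balance, however rough, indicates “where the balance lies”.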

Read the whole draft chapter…


Two recent publications have important implications for how Boards make decisions.  One is an academic treatise on how information is shared in teams.  A Board is a kind of team, working together to (among other things) make major decisions.  The practice of having a team make the big decisions is based on the idea that teams will, generally, make better decisions than individuals.  This is founded in turn on various assumptions:

  • Good decision making depends in part on taking into proper account relevant information;
  • Teams collectively possess more relevant information than individuals; and
  • Teams share and make use of that information in their deliberations.

The authors of Information Sharing and Team Performance: A Meta-Analysis focused on this third issue.  They did a comprehensive review of existing studies on how teams share information, making a number of interesting findings.  If we extrapolate those findings to Boards, we can infer:

  • That sharing of information in Board meetings will indeed improve Board decisions.
  • However, Boards will generally not share information as effectively as they could.
  • In particular, Boards will tend to spend their time talking about what everybody already knows, rather than sharing important information that only a few people know.
  • In fact, the more there is a need for information sharing, the less information sharing will actually happen.
  • The more time the Board spends talking, the more they just rehearse what they already know.
  • Boards will share better if they think they are solving some kind of factual issue, as opposed to making a judgement requiring consensus.
  • Boards share information better if they use a structured discussion process, rather than just indulging in the usual kind of spontaneous conversation.

In short, there should be scope for Boards to improve their decisions by changing the way they conduct their discussions so as to promote better sharing of critical information.

As it happens, a recent piece from McKinsey makes much the same point.  In “Using the crisis to create better boards” in the October 2009 issue of McKinsey Quarterly, the authors zero in on information sharing using structured techniques:

“Chairmen can expose their boards to new sources of information – such as new performance benchmarks, new customer demands, or new financial perspectives – in many ways.  One involves tapping into the rich experience of nonexecutive and executive directors who also hold external appointments.  Each board member can be asked to share one fresh idea as part of a discussion about the company’s future…”

The idea of going around the table asking everyone to contribute an idea is hardly profound or original, and it is curious that leading management consultants, in the pages of the journal of one of the top-shelf consulting firms, would be encouraging Boards of top organizations to make use of such a simple technique.   That such a suggestion is seriously being made indicates that the issue of poor information sharing, discussed in abstract terms in the academic meta-analysis, is in fact a very real problem at the highest levels.

Later in their piece, the authors get a little more specific about some of the information that needs to be shared and how to do it:

Chairmen ought to help their boards…by requesting that all significant proposals come with a “red team” report presenting contrary arguments…the chairman would merely request that the board hear arguments for and against any important proposal.  The CEO would therefore have to think deeply before submitting the proposal, undecided board members could insist on a fuller discussion, and a rival paradigm might see the light of day.

This suggestion is very much in line with our proposal that organisations improve Board deliberations, and hence decision making, by adopting decision mapping.   Decision maps, by their nature, include “the arguments for and against any important proposal”, though they include such arguments in a wider framework encompassing the overall structure of the decision.

The McKinsey authors seem to be suggesting – and we would agree – that Boards don’t need more information thrown at them, in the form of door-stopping Board reports or dense PowerPoints.  Rather, they should look to benefit by more effectively sharing with each other the critical information and insights which they may already have, and understanding what difference that information makes to the issue.


Julie Garland Mclellan has posted another of her Director’s Dilemmas.  This time I had the interesting task of coming up with one of the three “answers” or commentaries, and used decision mapping to derive my recommendations.   Here is the map:

Click here to view the zoomable pdf file.

Transcribing the map into prose yields the commentary:

Donna’s immediate issue is whether to accept the request to take the Chair….

Donna has three main options: accept the Chair, decline, or escape the issue by resigning. On moral grounds she should accept, given that she is the most appropriate person – the other Directors have chosen her – and it would satisfy her sense of responsibility to the majority of shareholders. Greater demands and stress would be offset by increased remuneration and status.

Once Chair, she would have at least three courses of action with regard to the troublesome Director. First, she should try to convince the trustee to replace the Director. Her “awareness” that the trustee would not support replacement sounds vague and may be ill-founded. Removing the troublesome Director would resolve the crisis unless the trustee appoints another ill-suited person.

Failing that, second, Donna should attempt to manage the situation. One option is to try to moderate the Director’s behaviour. The previous Chair’s failure suggests this is unlikely to succeed; however, a fresh approach may work better. Donna should bear in mind the “fundamental attribution error”, whereby we exaggerate the extent to which other people’s behaviour is driven by supposed personality traits rather than contingent circumstances. Another option is to contain the misbehaviour by meeting with other Directors and senior management to establish strong Board processes and norms.

In blocking the troublesome Director’s ambition to be Chair, the Directors were accepting the risk of his forcing an EGM. Perhaps the situation will only be resolved in this manner. Consequently Donna must, third, prepare for the EGM so as to try to ensure the best outcome for the company.

How does my commentary compare with the two others, which were by people with considerably more board experience?

  • There was a large overlap in the range of options that were considered across the three commentaries.
  • Julie’s commentary tended to offer more detail and nuance with regard to a selection of options that we both considered.

These observations suggest (very anecdotally of course) that a systematic approach to decision making can go a fair way towards making up for lack of domain knowledge; and more generally that there is such a thing as relatively generic, domain-independent expertise in thinking for decision making.   Of course, the best decision maker would combine both generic expertise and detailed domain knowledge.

