
Archive for the ‘Expertise’ Category

Almost everyone agrees that critical thinking skills are important.  Almost everyone agrees that it is worth investing effort (in education, or in workplace training) to improve these skills.   And so it is rather surprising to find that there is, in the academic literature, little clarity, and even less consensus, about one of the most basic  questions you’d need answered if you wanted to generate any sort of gains in critical thinking skills (let alone generate those gains cost-effectively); viz., how are critical thinking skills acquired?

Theories on this matter come in five main kinds:

  • Formal Training. CT skills are simply the exercise of generic thinking power which can be strengthened by intensive training, much as general fitness can be enhanced by running, swimming or weightlifting.  This approach recommends working out in some formal ‘mental gym’ such as chess, mathematics or symbolic logic as the most convenient and effective way to build these mental muscles.
  • Theoretical Instruction. CT skills are acquired by learning the relevant theory (logic, statistics, scientific method, etc.).  This perspective assumes that mastering skills is a matter of grasping the relevant theory.  People with poor CT skills lack only a theoretical understanding; if they are taught the theory in sufficient detail, they will automatically be able to exhibit the skills, since exhibiting skills is just a matter of following explicit (or explicable) rules.
  • Situated Cognition. CT is deeply tied to particular domains and can only be acquired through properly “situated” activity in each domain.  Extreme versions deny outright that there are any generic CT skills (e.g. McPeck).  Moderate versions claim, more plausibly, that increasingly general skills are acquired through engaging in domain-specific CT activities.  According to the moderate version general CT skills emerge gradually in a process of consolidation and abstraction from particular, concrete deployments, much as general sporting skills (e.g., hand-eye coordination) are acquired by playing a variety of particular sports in which those general skills are exercised in ways peculiar to those sports.
  • Practice. CT skills are acquired by directly practicing the general skills themselves, applying them to many particular problems within a wide selection of specific domains and contexts.  The Practice perspective differs from Formal Training in that it is general CT skills themselves which are being practiced rather than formal substitutes, and the practice takes place in non-formal domains.  It differs from Situated Cognition in that it is practice of general skills aimed at improving those general capacities, rather than embedded deployment of skills aimed at meeting some specific challenge within that domain.
  • Evolutionary Psychology views the mind as constituted by an idiosyncratic set of universal, innate, hard-wired cognitive capacities bequeathed by natural selection due to the advantages conferred by those capacities in the particular physical and social environments in which we evolved.  The mind does not possess and cannot attain general-purpose CT skills; rather, it can consolidate strengths in those particular forms or patterns of thinking for which evolution has provided dedicated apparatus.  Cultivating CT is a matter of identifying and nurturing those forms.

Formal Training is the oldest and most thoroughly discredited of the perspectives.  It now seems so obvious that teaching Latin, chess, music or even formal logic will have little or no impact on general critical thinking skills that it is hard to understand how this idea could ever have been embraced.  And we also know why it fails: it founders on the rock of transfer.  Skills acquired in playing chess do not transfer to, say, evaluating political debates.  Period.

Theoretical Instruction has almost as old a philosophical pedigree as Formal Training.  It has been implemented in countless college critical thinking classes whose pedagogical modus operandi is to teach students “what they need to know” to be better critical thinkers, by lecturing at them and having them read slabs out of textbooks.   Token homework exercises are assigned primarily as a way of assessing whether students have acquired the relevant knowledge; if they can’t do the exercises, what they need is more rehearsing of theory.   As you can probably tell from the tone of this paragraph, I believe this approach is deeply misguided.  The in-depth explanation was provided by philosophers such as Ryle and Heidegger, who established the primacy of knowledge-how over knowledge-that, of skills over theory.

Current educational practice subscribes overwhelmingly (and for the most part unwittingly) to the moderate version of Situated Cognition.  That is, we typically hope and expect that students’ general CT skills will emerge as a consequence of their engaging in learning and thinking as they proceed through secondary and especially tertiary education studying a range of particular subjects.  However, students generally do not reach levels of skill regarded as both desirable and achievable.  As Deanna Kuhn put it, “Seldom has there been such widespread agreement about a significant social issue as there is reflected in the view that education is failing in its most central mission—to teach students to think.”  In my view the weakness of students’ critical thinking skills, after 12 or even 16 years of schooling, is powerful evidence of the inadequacy of the Situated Cognition perspective.

There may be some truth to the Evolutionary Psychology perspective.  However in my view the best argument against it is the fact that another perspective – Practice – actually seems quite promising.   The basic idea behind it is very simple and plausible.   It is a truism that, in general, skills are acquired through practice.   The Practice perspective simply says that generic critical thinking skills are really just like most other skills (that is, most other skills that are acquired, like music or chess or trampolining, rather than skills that are innate and develop naturally, like suckling or walking).

In our work in the Reason Project at the University of Melbourne we refined the Practice perspective into what we called the Quality (or Deliberate) Practice Hypothesis.   This was based on the foundational work of Ericsson and others, who have shown that skill acquisition in general depends on extensive quality practice.  We conjectured that this would also be true of critical thinking; i.e., critical thinking skills would be (best) acquired by doing lots and lots of good-quality practice on a wide range of real (or realistic) critical thinking problems.   To improve the quality of practice we developed a training program based around the use of argument mapping, resulting in what has been called the LAMP (Lots of Argument Mapping) approach.   In a series of rigorous (or rather, as-rigorous-as-possible-under-the-circumstances) studies involving pre-, post- and follow-up testing with a variety of tests, and setting our results in the context of a meta-analysis of hundreds of other studies of critical thinking gains, we were able to establish that gains in critical thinking skills could be dramatically accelerated, with students reliably improving 7-8 times faster over one semester than they otherwise would have simply as university students.   (For some of the detail on the Quality Practice Hypothesis and our studies, see this paper, and this chapter.)
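
The arithmetic behind a “times faster” claim is worth making explicit: when gains are measured as standardized effect sizes (the gain divided by the standard deviation of test scores), the acceleration is just the ratio of the treatment-group gain to the baseline gain over the same period.  Here is a minimal sketch in Python; the function name and all the numbers are illustrative assumptions, not the actual figures from our studies.

```python
# A minimal sketch, with purely illustrative numbers (not the actual study
# figures), of how a "times faster" claim can be derived when gains are
# measured as standardized effect sizes.

def effect_size(pre_mean, post_mean, pooled_sd):
    """Gain from pre-test to post-test, in standard deviation units."""
    return (post_mean - pre_mean) / pooled_sd

# Hypothetical one-semester gain for students doing intensive argument mapping
lamp_gain = effect_size(pre_mean=50.0, post_mean=57.0, pooled_sd=10.0)  # 0.70 SD

# Hypothetical gain from ordinary university study over the same semester
baseline_gain = 0.09  # illustrative assumption, not a measured value

print(f"Practice-based gain: {lamp_gain:.2f} SD")
print(f"Acceleration over baseline: {lamp_gain / baseline_gain:.1f}x")  # ~7.8x
```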

So if I had to choose one theory out of the five on offer, I’d choose Practice.  Fortunately, however, we are not in a forced-choice situation.  Practice is enhanced by carefully placed Theoretical Instruction.  And Practice can be reinforced by Situated Cognition, i.e., by engaging in domain-specific critical thinking activities even when they are not framed as deliberate practice of general CT skills.   As one of the greatest critical thinkers said in one of the greatest texts on critical thinking:

“Popular opinions, on subjects not palpable to sense, are often true, but seldom or never the whole truth. They are a part of the truth; sometimes a greater, sometimes a smaller part, but exaggerated, distorted, and disjoined from the truths by which they ought to be accompanied and limited.”


Read Full Post »

A favorite Dilbert cartoon from a few years back has one character at a restaurant smugly insisting to his dining partner that he would never be so stupid as to provide his credit card details online.  Meanwhile he is paying the bill by handing his credit card to a waiter who disappears with it, supposedly only processing the dinner payment.

The cartoon illustrates how difficult it is to be consistently rational, i.e., rational whenever we should be rational, and rational in the same way across similar situations.   Even highly rational people have blind spots where they fail to exercise their rational faculties on matters that even they would agree call out for rational assessment, and indeed don’t realise that they are failing to do so.

“Highly rational people” includes faculty members on admission committees of prestigious medical schools.

A “Perspective” piece in a recent issue of the top-shelf medical journal The Lancet, by Donald Barr of Stanford, describes how admission committees usually place a heavy emphasis on strong performance in science subjects, supposedly because those who are good at science will become good doctors and vice versa.   Barr decided to examine the evidence for this presumed correlation.   What he found, roughly, is that the evidence pointed the other way: the better you perform in undergraduate sciences, the worse you are as a doctor.  (Of course you should read Barr’s article for a more detailed and nuanced summary of the evidence.)

In other words, faculty members on admission committees of medical schools – the kind of people who would think of themselves as highly rational, who would readily stress the importance of taking proper account of the scientific evidence in medical practice – these faculty members were basing their admission decisions on a belief that was unfounded, erroneous, and harmful to their profession!

Barr’s scathing commentary is worth quoting at length:

If what we seek from our students is professional excellence, we must be careful to base the manner in which we select these students on scientific evidence, not on superstition. Beyond its religious connotation, The Oxford English Dictionary suggests that superstition represents an, “irrational or unfounded belief”.  The belief that one’s knowledge of science represents a continuous metric of the quality of one’s preparation for the study of medicine represents precisely such an “unfounded belief”. There seems to be no scientific evidence to support it. Great physicians base their professional practice on a threshold of scientific knowledge they have acquired throughout their career. Upon this foundation they build an artistic display of communication, compassion, empathy, and judgment. In selecting students for the study of medicine, we must be careful to avoid superstition, and to adhere to the evidence that equates as metrics of quality a preparation in fundamental scientific principles and the non-cognitive characteristics that are conducive to professional greatness.

Note that these medical faculty members’ admissions decisions are in an obvious sense “evidence-based.”  Each applicant would have provided a dossier of information (grade transcripts, letters of recommendation, etc.) and the learned professors would have been taking careful note of that evidence at least.

However, their method of making decisions was not adequately evidence-based.  In adopting their method they had implicitly made judgements, not about the students themselves, but about the criteria for selecting students; and those judgements were not properly based on evidence.

It might be helpful to distinguish first-order from second-order evidence-based decisions.   A first-order evidence-based decision is one which properly takes into account the evidence relevant to the decision at hand.  This may include the particular facts and circumstances of the case, as well as more general scientific information.  So, for example, in a clinical context, a doctor making a first-order evidence-based judgement as to treatment would consider the available information about the patient as well as scientific information about the effectiveness of the treatment being recommended.

Now, taking evidence into account properly implies the existence of some method for finding and evaluating evidence and incorporating it into the final judgement.

A decision is second-order evidence-based when the choice of method is itself properly evidence-based.  Somebody making a second-order evidence-based decision is considering (or has duly considered) not only the evidence pertaining to the decision at hand, but also the evidence pertaining to the choice of method for making that type of decision.

[In theory, of course, there would be third-order evidence-based decisions (evidence-based decisions about how to decide what method to use in making a decision), and so on.]
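
To make the distinction concrete, here is a toy sketch in Python.  It is my illustration only – nothing in it comes from Barr’s article – and all the method names, cases and “track record” numbers are hypothetical.  The point is purely structural: a first-order decision applies a fixed method to the evidence about the cases, whereas a second-order decision also uses evidence to select the method.

```python
# Toy illustration of first- vs second-order evidence-based decisions.
# All names and numbers are hypothetical.

def method_a(case):              # e.g. "select on science grades"
    return case["science_gpa"]

def method_b(case):              # e.g. "select on interview performance"
    return case["interview"]

def decide(cases, method):
    """First-order: apply a given method carefully to the case evidence."""
    return max(cases, key=method)

def choose_method(methods, track_record):
    """Second-order: rank the candidate methods on evidence of how well
    their past selections predicted the outcome we actually care about."""
    return max(methods, key=lambda m: track_record[m.__name__])

cases = [{"name": "A", "science_gpa": 3.9, "interview": 2.0},
         {"name": "B", "science_gpa": 3.2, "interview": 4.5}]

# Hypothetical predictive-validity data for each method.
track_record = {"method_a": 0.55, "method_b": 0.72}

best_method = choose_method([method_a, method_b], track_record)
print("Method chosen on the evidence:", best_method.__name__)    # method_b
print("Selected applicant:", decide(cases, best_method)["name"])  # B
```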

Donald Barr can be seen as urging that the medical profession be (or be more) second-order evidence-based in its admission decisions.  His recommendation is not simply that committees take scientific data into account in any particular decision.  It is, rather, that they allow scientific evidence to shape their general framework for making admission decisions.

The distinction between first- and second-order evidence-based decisions is somewhat subtle.  In my experience it can be difficult for people to grasp the distinction and to appreciate the importance of being second-order evidence-based.

One experience of this, as it happens, was also with medical types.

I was contacted by a doctor – call him Dr. Smith – who was on a state committee whose job it was to spend millions of taxpayer dollars on fancy new medical equipment.  The committee received many applications and had to decide which of them most deserved funding.  As he described it, the committee was using a relatively lightweight, informal version of standard multi-criteria decision making (list your criteria, weight your criteria, rate each option on each criterion, etc.).  Why were they using this method?  Apparently because “it seemed like the right thing to do” and “that’s the way we do it”.
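
For readers unfamiliar with the procedure, it looks roughly like the following sketch.  The criteria, weights and ratings here are invented for illustration; the committee’s actual ones were of course different.

```python
# A bare-bones version of informal multi-criteria decision making:
# list criteria, weight them, rate each option on each criterion,
# and rank the options by weighted sum. All content here is invented.

weights = {"clinical_benefit": 0.5, "cost_effectiveness": 0.3, "strategic_fit": 0.2}

applications = {
    "MRI upgrade":     {"clinical_benefit": 4, "cost_effectiveness": 3, "strategic_fit": 5},
    "Robotic surgery": {"clinical_benefit": 5, "cost_effectiveness": 2, "strategic_fit": 3},
}

def weighted_score(ratings):
    """Aggregate 1-5 ratings into a single score via the criterion weights."""
    return sum(weights[c] * ratings[c] for c in weights)

for name, ratings in sorted(applications.items(),
                            key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(ratings):.2f}")
```

Note that every step here – which criteria to list, how to weight them, whether a weighted sum is the right aggregation at all – is itself a methodological choice, which is exactly where second-order evidence comes in.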

Dr. Smith was concerned that the way in which the committee was making these decisions was insufficiently rigorous and transparent.  In particular he was worried that the process was not reliable enough, in the sense of treating similar cases the same way.  There was too much room for the slip and slop of unanchored subjective judgments.  This despite the fact that their decisions took into account a large amount of information and even scientific evidence – such as data about the benefits of certain types of treatments and technologies.

However Dr. Smith was a lone voice.  He was trying to persuade the chair of the committee, and other committee members, and the secretariat supporting the committee, that they should spend at least some time thinking not just about who should get the money, but about how they decide who should get the money.

I was invited to speak to the chair and the secretariat.  In a web-conference, I explained that there were many possible ways to make decisions of the kind they had to make.  Indeed there is an academic niche concerned with this very issue, i.e., how to make allocation decisions.   These methods have been subjected to some degree of scientific study, and it is clear that some are better than others for particular types of decisions.   I explained that if they were not using the best method, they would likely be mis-allocating resources.  (Which means, in this kind of case, not just wasting money but failing to save as many people from sickness and death as they might have.)

Then came the really tricky part.  I tried to explain, gently, that evidence-based selection of a decision method requires expertise – and not the expertise they have as doctors, professors of medicine, or even medical administrators.  Rather, it is the expertise of a decision theorist.   Simply put, being smart and having a fancy medical appointment doesn’t make you qualified to decide how to decide who should get funded.

As you can imagine, the conversation was polite but it went nowhere.   I was trying to explain to senior medical professionals that their decisions could be more rigorously evidence-based.  They prided themselves on making evidence-based decisions, and they were indeed making evidence-based decisions – but only first-order evidence-based ones.  I was trying to convince them to be second-order evidence-based (though I didn’t use those words).  Further, I was suggesting that their expertise wasn’t the sort of expertise that was really needed for their decisions to be properly second-order evidence-based.

I had the feeling that they could hear the words but didn’t really “get” what I was saying.  Last I heard, nothing had changed in the way they were making their decisions.

Read Full Post »

The central responsibility, for Boards and for individual Directors, is to make good decisions.  What can Directors do to improve their decision making ability?

First, it is important to understand that decision making is a complex cognitive skill.  It is not an innate talent that some people were granted at birth.  Nor is it something that simply builds up with lots of experience – though experience is certainly relevant.

Rather, decision making is a cultivated skill, or in other words a domain of acquired expertise.   So the question becomes – how can Directors acquire more of this expertise, particularly with regard to the specific kinds of decision challenges that come before Boards?

We can look to contemporary cognitive science for some insights here.  A sub-field of cognitive science addresses the problem of expertise.   Scientists in this area are basically asking “How do people get really good at something?”

The dominant answer that has emerged over the past few decades is not really all that surprising.  To get really good, you need to do lots of good practice.   To be the best, you need to do the most, and the best-quality, practice.  The main reason Tiger Woods has been the top golfer is that he practised more, and more effectively, than anyone else.

For Directors, this means that to enhance your decision expertise, you need lots of practice making Board-type decisions.   Note that it is quality practice you need, not necessarily experience.   A Sunday golfer can accumulate lots of experience over the years, but their game never really improves because they are not working “on” their game.   In fact, lots of experience can make it harder to improve skills, because it can entrench bad habits.

This is where Julie Garland Mclellan’s “Director’s Dilemma” newsletter, and her new book Dilemmas Dilemmas, are so valuable.   They present lots of realistic case studies of decision problems of the kind that Directors regularly confront.  They give lots of opportunities for Directors to work on their game.

However, simply working through the case studies – and reading the “model answers” provided – may not be enough.   What really matters is how you practice.  And here the key insight is the obvious point that if you’re not in some way changing the way you do things, then your expertise will not improve.   To really get the benefit of working through case studies, you need to be expanding and refining your skill set.

Directors’ decision challenges, as illustrated in Dilemmas Dilemmas, are typically complex deliberative problems.  That is, they involve clarifying the problem, recognising a range of options and perhaps sub-options, understanding the advantages and disadvantages of those options, and weighing them up.   The core skill is disentangling and evaluating a complex set of issues and arguments.  How can a Director come to do this better?

One approach is to exploit the visual.  We know, in general, that when confronted with complexity, the human mind performs better with suitable visualisations.  A simple example: to understand how all the streets, train lines, etc., in a modern city are laid out, we use diagrams such as the maps found in a road atlas or, these days, the displays on GPS devices.

What is true for roads is even more true for deliberative decisions, which are abstract and indefinitely complex.   The human mind can cope more easily when the options, pros and cons, arguments and detailed evidence are laid out in a visually attractive, easily scannable form.   And the process of laying out the problem in this form can help introduce clarity and rigour.   The overall result is better understanding of the problem, better evaluation of the options, and, on average, better choices being made.

This approach to decision making has been pioneered by Austhink, who have developed a software tool (bCisive) to support the process.
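
By way of rough indication only, here is a minimal stand-in – not bCisive’s actual format or API, and with invented content – showing what “laying out the problem” means: a decision question, its options, and the considerations for and against each option, rendered in a scannable form.

```python
# A minimal stand-in for a decision map: a question, options, and the
# considerations for and against each option. Not bCisive's format;
# all content is invented for illustration.

decision_map = ("Should the Board approve the proposed acquisition?", [
    ("Approve now",
     ["Pro: secures a first-mover advantage",
      "Con: due diligence is incomplete"]),
    ("Defer for six months",
     ["Pro: time to verify the target's accounts",
      "Con: a rival bidder may move first"]),
])

def render(dmap):
    """Print the question, then each option with its pros and cons indented."""
    question, options = dmap
    print(question)
    for option, considerations in options:
        print(f"  Option: {option}")
        for item in considerations:
            print(f"    {item}")

render(decision_map)
```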

Austhink has teamed up with Julie Garland Mclellan to offer workshops combining her Board expertise and case studies with Austhink’s methods and tools.  The workshops are intended primarily for “emerging” Directors – those who have recently become Directors, or hope to become Directors, or hope to take their directorial activity to a higher level.  Our ambition is that these workshops would be one of the most effective things that Directors – at any level of experience or expertise – could do to enhance their decision making expertise.   And if individual Directors can improve their game, Board decision making in general should also improve.

More information about the workshops.

Read Full Post »