Reset? – Games In Learning

Uh oh. Houston – we have a problem…

…my learning game didn’t prepare me for this!

We have recent experimental evidence that narrative educational games lead to poorer learning and take longer to complete than simply displaying the lesson content in a slide presentation. 

I read this when it first came out – in an ASTD article – that someone in my online network brought to my attention. And it was no big deal – as it was what I had expected – it concluded what I already knew.

And – I had heard this “controversy” before.

But I reacted a bit differently when I saw an online (albeit, a somewhat mild) attack on Dr. Ruth C. Clark about her recent ASTD article: 

WHY GAMES DON’T TEACH

April 30, 2012 – by Ruth Colvin Clark

Read the Article above and the Comments and her replies.

I won’t pinpoint my source – one of a couple of attacks that I read online – I’m sure that you can find it – out there in cyberspace. It is currently in the “echo chamber” of RT and Re-Postings.

And maybe “attack” is too strong a word.

But Ruth has been a respected Practitioner – bringing the Research to the field and speaking publicly about it – in Instruction/Training/Learning – for decades (and decades). So I really dislike her name being dragged through the cyber-mud by some whose credentials don’t hold the proverbial candle to her shining star – in the biz.

And – to expect her to dispute and disprove “every other study under the sun” that “proves” that Games Teach – as one criticism demands – is a bit much. Really. And is somewhat telling in itself.

I didn’t check them out myself – “all those other studies” – who was behind them and what their methods were. And whether they were “vested” interested parties or not. But I know that the studies Ruth would review and trust went through that kind of scrutiny.

About Ruth Colvin Clark

Ruth Clark is a specialist in instructional design and technical training, determined to bridge the gap between academic research and practitioner application in instructional methods. She holds a doctorate in the field and is president of her own company, Clark Training & Consulting. Her books and articles focus on various aspects of training and e-learning.

My Next Step

So – as I have known Ruth for almost 30 years – not really well, but well enough that she answers my phone calls and emails – an NSPI/ISPI tradition – I wanted to do something.

For I also knew that this entire area of hype – Games in Instruction/Learning/Training and Gamification – is another one of those Myths that will take a while to kill off – if it can be killed off at all. It might not be – because, as you can imagine, there are BIG BUCKS involved and it’s just so darn INTUITIVE. Think of Learning Styles (designing for them – not merely having a preference for one) and the use of the MBTI in job selection to predict future success. They will probably never be beaten back into the bushes – as there are too many vested interests pushing their research with the big bucks, seeing a sale born every minute.

And again – it’s just so intuitive – isn’t it?

Like Turning Away From the Skid.

Intuitively seductive – or is that – seductively intuitive? Whatever. And – more importantly than intuitively – not correct.

But I digress.

So – I emailed some colleagues who I know know the Research – the Evidence for and against stuff in the Learning Space – for their response. I knew that they know Ruth. And her work. And her focus on the Evidence.

One thing I expected back was some words about “Popularity Is Not Evidence” – because popularity is often treated as proof of validity in some people’s minds and writing. It is not! But it is what guides many. Too many.

Voices of Wisdom From My Crowd

One respondent was Dr. Richard E. Clark – an APA Fellow – who researches the research at USC’s Center for Cognitive Technology for other Research Institutes and organizations such as the U.S. Army – who wrote me:

I agree with Ruth but the True Believers in Games out there may never accept the evidence.

I’ve attached my own review and it may be more extensive than Ruth’s. I’m also attaching some slides I used in a debate at AERA in 2011 on this topic. Yet none of this evidence convinces the ideologically committed gamesters. They variously argue that “the research did not study a game; I define ‘game’ differently,” or “I’d need to see the game that was tested; it was probably poorly designed,” or… it goes on and on.

What I find interesting is that in my review I checked all of the internal research conducted by the military (arguably the biggest buyer of “serious” or training games in the US) and their own research reports indicate that games are not as effective or as efficient at training as other, less expensive options (I’ve cited all of the military reviews that are available). Yet the military trainers manage to ignore their own studies.

There are some topics where so much money and ideology and personal expectations are at risk that reason and evidence are simply ignored.

This is why we need a commitment to EBP.

Regards,

Dick

Here is what APA Fellow Dick Clark sent me.

AERA_Games_Debate_Clark 3-20-2011-1 and SeriousGamesResearchET_June07

Dick can be reached at:

Dick Clark – Richard E. Clark, EdD
Center for Cognitive Technology – Rossier School of Education – University of Southern California
clark@usc.edu

And his Research Center – with many PDF papers, etc. – may be found at: www.cogtech.usc.edu

Another Level Another Voice

Jeanne Farrington, EdD wrote in her Performance Improvement Quarterly column – Volume 24, Number 2 / 2011:

From the Research: Myths Worth Dispelling – Seriously, the Game Is Up

Given the average cost of creating a serious simulation game, which can start with low-fidelity games at $20,000 to $50,000 (Derryberry, 2008), but easily reach $1 million or more (Clark, 2007; Derryberry, 2008; Sitzmann, in press), those considering including simulation games in a training program should have a strong rationale for doing so.

And…

Simulations, which are a hallmark of serious games (Tobias & Fletcher, 2007), provide practice outside the actual performance environment. Embedding simulations in games can provide varying levels of fidelity to the intended environment, making the practice more or less realistic.

Simulations are often used when the regular environment is not available, or if the consequences of error are too high or too costly for people to practice there. Serious games can also include a wider range of practice options and can provide choices to learners during game play that might not be practical in other settings (Aldrich, 2009). In addition, learners can be afforded nearly unlimited opportunities for practice via serious games. Although the study did not include controls for time on learning tasks, Sitzmann (in press) found that those who had unlimited access to games retained more than those whose practice time was limited.

And…

Although it may seem obvious that apparent similarity between a game and real life is sufficient, it is not. The key to transfer with simulations is to ensure that the cognitive processes learners use during game play are similar to the tasks they are learning to perform (Tobias & Fletcher, 2007). The best way to ensure this similarity of tasks is through cognitive task analysis (see, for example, Chipman, Schraagen, & Shalin, 2000).

Dr. Farrington can be reached via her web site at: http://www.jfarrington.com/jeanne.html

It’s the quote in this last slide (above) from what Dick Clark sent me – that really connects with what Ruth was saying in her article…which was…

Despite the uncontested popularity of commercial games and a lot of hype in the training community, the reality is that there is scarce credible evidence on how and when to best use games to improve instructional outcomes and motivation. At this stage, I recommend games to implement drill and practice exercises for tasks that require immediate and accurate responses. Hopefully we will cultivate a more refined approach to categorize the features of games that best match various instructional goals, similar to the Bloom’s Taxonomy of learning objectives. If you are determined to gamify, I recommend testing a prototype version to evaluate its effectiveness and efficiency compared to a more traditional approach.

Let’s Not Play Games With Games in Learning

It doesn’t help when someone writes something along the lines of this (I changed it so don’t bother Googling it):

Reason #1: Per Neuro-Science – playing games causes the brain to release dopamine. And Dopamine creates Pleasure. And Pleasure is good for learning.

What!?!

Tell or suggest that – or the reverse – to a Marine after they’ve just finished up Boot Camp.

But – check with your insurance agent first.

The Bottom Line – Is The Bottom Line

Do Games teach? They can.

Learning objectives can be mastered – hopefully objectives that were authentic enough and not too generic/general. Objectives that align very closely with the Performance Objectives – and don’t include a lot of extraneous things to learn/master that have nothing to do with terminal, on-the-job performance requirements – for that would simply create Cognitive Overload for the Learner – and that’s not a good thing.

And if they teach to the objectives – do they teach them both effectively AND efficiently compared to other alternatives? Or did it “have to be games” – and if so, “why” was that?

Was there a less costly or less time-consuming approach? And not just first costs – but life cycle costs – which would include post-development & release costs, such as administration and maintenance, over the entire life cycle.
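To make that life-cycle framing concrete, here is a minimal sketch of the arithmetic, using purely hypothetical placeholder figures (none of these numbers come from the article or any study cited in it):

```python
# Hypothetical life-cycle cost comparison for two training approaches.
# All dollar figures below are illustrative placeholders.

def life_cycle_cost(development, annual_admin, annual_maintenance, years):
    """Total cost of ownership: first costs plus recurring costs
    over the course's entire service life."""
    return development + (annual_admin + annual_maintenance) * years

# A "low-fidelity" serious game vs. a conventional slide-based module,
# each kept in service for 5 years.
game_cost = life_cycle_cost(development=50_000, annual_admin=8_000,
                            annual_maintenance=12_000, years=5)
slide_cost = life_cycle_cost(development=10_000, annual_admin=5_000,
                             annual_maintenance=2_000, years=5)

print(game_cost)   # 150000
print(slide_cost)  # 45000
```

The point of the sketch: if both approaches reach the same learning result, comparing first costs alone understates the gap – the recurring administration and maintenance costs are what dominate over the life cycle.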

That’s what a Serious Business professional is concerned about. Value over the life cycle. Not the cool factor.

And exactly what a Serious Games professional should be all about too.

Good stewardship of Shareholder Equity in the L&D space is not just a good practice.

It is one of the key measures of your worth as a professional.

You’d think so if it were all your own money.

Treat the shareholders’ equity – the money – as if it were your own – and as if you were a serious business professional and not some fangirl or fanboy in love with this shiny game technology regardless of plain old business sense. That would be cool.

Use Games in Learning?

Just because you can – doesn’t mean you should.

Reset?

As always – it depends.

Lessons in Making Lemonade

# # #

8 comments on “Reset? – Games In Learning”

  1. Pingback: Games Teach!

  2. There is a lot that can be said about this topic, most of which has already been shared. As others have said, one cannot dismiss a medium as ineffective, only the instructional design behind it. And while it is SOMEWHAT true that you can design effective instruction using any medium (deep breath, this is where I run the risk of resurrecting the Clark-Kozma debate), some instructional methods and some instructional outcomes are more easily supported by some media than others.

    Few would dispute that distance learning CAN be as effective as, or more effective than, face-to-face learning, but the medium does not EASILY support social aspects of learning, discussion, and so forth, for example (yes, video conferencing is great, but you don’t get body language, facial expressions, etc. to the same extent as you do live, AND conferencing is more DIFFICULT than just being in the same room with someone).

    So we choose our media and mode of delivery in accordance with our outcomes, strategies, learners, and constraints. That goes for games too: I have always said that game-based learning should be reserved for the goals and outcomes that are not easily served by other means and when resources are sufficient (time being one of the keys).

    Having said that, there ARE some outcomes that are more easily addressed via simulations and games, chief among them being understanding complex systems and processes that, were they reduced to declarative knowledge rather than experienced and manipulated over time (with all the necessary instructional events like debriefing, feedback, guidance, etc.), would be much more DIFFICULT (not impossible) to achieve. In some cases, so difficult that it makes those other media/modes IMPRACTICAL to consider.

    Chris Crawford, a well-respected game designer and author, shares an excellent example in his chapter Interactivity, Process, and Algorithm in a book I edited in 2010 (Interdisciplinary models and tools for serious games: Emerging concepts and future directions). Although his point is to argue for better programming of games and emotion and human interactions, he makes the point that process outcomes – such as understanding WHY Napoleon lost the battle of Waterloo NOT as a set of facts and propositions, but as a process/system – are achieved through manipulating the battle components, changing perspectives, adding more troops, changing the timing, etc., and watching the results. Over time, through interactions with a complex system/simulation, one builds a complex, layered, systemic view of the reasons that COULD be reduced to a set of declarative knowledge/facts, but which would not be LEARNABLE from those facts. And yes, you COULD learn through means other than simulation, but it would take LONGER and be less practical. This is the same reason that many have advocated using Civilization and Sim City to understand history and geography from a systemic perspective (I have a student who developed a lesson plan for 8th grade geography that achieves outcomes like this that could not have otherwise been done within the constraints of her classroom).

    For those outcomes, and for cases where the resources support their use, simulations and games will always be “better” not because they are new ways of learning, but because they SUPPORT those outcomes and associated strategies more effectively.

    And by the way, to add another study to the mix of “games don’t teach,” I published an article in ETR&D in 2004 that showed a game I developed promoted transfer (something hard to achieve by existing means within the constraints of K12 education) and another in JCMST in 2006 that showed it promoted attitude toward mathematics. Both were, in my opinion, well-designed research (they earned me my PhD, in any case):

    Van Eck, R., & Dempsey, J. (2002). The effect of competition and contextualized advisement on the transfer of mathematics skills in a computer-based instructional simulation game. Educational Technology Research and Development, 50(3), 23–41.

    Van Eck, R. (2006). The effect of contextual pedagogical advisement and competition on middle-school students’ attitude toward mathematics and mathematics instruction using a computer-based simulation game. Journal of Computers in Mathematics and Science Teaching, 25(2), 165–195.


  3. Pingback: » Blog Archive » Weekly Bookmarks (5/20/12) » Education Is Everything

  4. Pingback: Weekly Bookmarks (5/20/12) « Experiencing E-Learning

  5. Hi Guy, I assume that my criticism of Clark is at least one of the ones that you feel is “dragging her name through the cyber-mud,” although I really was trying to attack her arguments and not make any ad hominem attacks against her personally. You’re correct that my credentials don’t come anywhere near the immense contributions that Ruth Clark has made in the field. However, attacking my credentials is an ad hominem attack, and implying that I have a vested interest (I don’t) is another one. I think it’s telling that you started your post with two logical fallacies rather than any of the multiple legitimate arguments that you put forth later in your post. Even experts can make mistakes; we can’t simply take them at their word because they have made great contributions in the past.

    Part of what frustrated me so much with Clark’s article is that I know that she knows how to handle and interpret research. She has this vast experience and influence in the field, and she chose to use it by reporting and applying a study in a way that the researchers themselves say it shouldn’t be used. Clark says this single study proves games shouldn’t be used for anything other than drill and practice, but the researchers themselves are quite circumspect about the limitations of their study (I received a copy of the original study after publishing my post):

    “These findings do not prove that all narrative discovery games are ineffective; rather, they show that the two well-designed narrative discovery games we used in this study were less effective than corresponding slideshows in promoting learning outcomes based on transfer and retention of the games’ academic content.”

    I think that if we’re going to be serious learning professionals, we have to acknowledge the limitations of the research. We can’t be so sloppy as to think that any single study has all the answers. Sorry, but if Clark is going to claim that games never work (as she did in her title) or that they only work for drill and practice (as she did in her concluding paragraph), she does actually have to support those opinions. She can’t just ride on her credentials. She needs to explain why the research contradicting the single study cited isn’t valid or applicable.

    I agree with you that people don’t read research carefully and take the time to understand it, and there’s tons of misinformation out there as egregious as your games-dopamine-pleasure-learning example. I also agree with your final conclusion. Games aren’t always the right answer, and they are too expensive for most situations. You’re right that we need to be responsible stewards of money. Clark’s post wasn’t primarily about the cost, though; it was about games not being able to teach anything. As you said, games can teach, so Clark is mistaken when she says they can’t. She shouldn’t get a pass on that misstatement just because she has made past contributions to the field.


  6. Pingback: Games Teach! | Kapp Notes

  7. Hey, Guy!

    We agree on a lot of things. And I’m mostly in agreement with you here. But I’ll have to admit that I was a little taken aback by Ruth’s article. Two things that hit me were the title and the appearance that the narrative game (a gated mechanism that uses indirect methods to convey something that is easily communicated directly) used in the comparison is quite obviously inferior to the slide presentation. One provides the concept immediately; the other obscures the concept behind silly mechanics. Overall, I agree with Ruth, but I also think there’s room for discourse and discussion in this space. No thinking is infallible. No voice more important than another.

    As a lifelong gamer, I can say that games do teach “something” – whether it was something valuable or intentional (I believe it’s not) is up for debate and as yet unproven. I’ve got lots of useless stuff in long-term memory. Things I won’t forget because of the challenge, repetition, and feedback offered by the mechanism. I also have some skills that are peripheral to the intent of the games from which they were acquired. Skills like group facilitation of complex activities at a distance. There are some really interesting parallels that are difficult to define if all you look at is the mechanism itself. Maybe we’re not asking the right questions in our research. Maybe our researchers don’t play games. Maybe this generates a bias from that camp. I only point this out because many of the “serious game” design folks I’ve talked to and many of the instructional design folks that criticize the value of games don’t actually play them. Maybe this doesn’t matter. It could be a factor in the rash of gamification failures.

    Plenty of the people I know that play games scoff at many of the attempts to gamify learning. I think it takes familiarity with something to make good judgments about it. I play games but I believe that for direct instructional support there are FAR more efficient methods than using games to convey facts and principles.

    Do “serious games” belong in instruction? Used as direct methods to teach concepts, I don’t think so. It’s a waste of energy. The effort-to-outcome ratio just doesn’t balance.

    As implied, relevant work simulations can provide real skill progression through varied practice, but even these will fail if poorly designed and executed. If you have confidence in your service provider, clear expectations for measurement, and the cost-benefit jibes – go for it. If not… leave it.

    Representation: Abstract ↔ Concrete
    Conveyance: Direct ↔ Indirect

    I think there are some corollaries we could draw using these axes that could predict how well a game *could* perform. This doesn’t imply comparison, necessarily, to alternate methods. Nor does it imply a ratio score for effort to outcome. Standalone performance, comparison to alternatives, and effort to outcome are important, and I only see these referenced tangentially or partially in much of the research. One other potential shortfall of research is narrow focus on isolated intervention versus value to the continuum.

    There is work to be done to draw distinctions between the application of game theory and games themselves. The value of game mechanics bolted onto a learning activity doesn’t seem to have been proven. I’d be careful about drawing a correlation between these apparent failures and the real learning potential of a well-designed game.


    • Hi Steve! Again, thanks for commenting. About Ruth’s title – I have almost never had an Editor “not” take license and change the title submitted to something that fit their editorial theme for an issue (from back in the days of paper issues of magazines: Training Magazine, NSPI’s PIJ, ASTD’s Technical & Skills Journal) – and that was maddening for an author – as sometimes the title made no sense given the content – or it twisted the intention of the article – or they edited out key content – making us feel like we were idiots for penning such “stuff.” That was one of the reasons my partners and I developed a quarterly newsletter back in the mid 1980s, where we could publish our own stuff without others interfering. Who knows what happened to Ruth’s article. But stuff like that does happen. Sometimes without the author having any say at all.

      If the context is Enterprise Learning – then one should be wary of investing shareholder equity in things that cost more than other alternatives for the same result. That was one of my takeaways from Ruth’s article. Worrying about Cognitive Load is another. Do game elements distract from the targeted learning or not? To me that is key. What was the intent of the investment?

      Was the intent to entertain a lot and then teach a little? To entice recruits into the Enterprise and demonstrate how “hip” the Enterprise is? That could be a valid Enterprise goal – making a game one of the right choices. If it was to help the Learner become a Performer – and do that most effectively AND efficiently – then perhaps not – except in some of the obvious situations – Pilot-Training, etc., etc. where it is too dangerous to learn in some other manner. There are lots of those kinds of situations where it might be appropriate for certain target audiences. But there are more situations and target audiences where it is not really appropriate – IMO. Not from a Steward of Shareholder Equity point of view – the POV I personally subscribe to – and take.

      But as always – it depends.

      Cheers!

      Guy

