Debriefing Instructional Analysis and Design Team Meetings – Part 2 of 2

This post continues from Part 1 of this series.

The Debriefing Steps

When it’s time to debrief, debrief.  Make sure the team has had a recent break.  Offer them a five- or ten-minute break before starting the debrief.  Let them know that the debriefing may take 20 to 40 minutes.

This debriefing happens at the conclusion of a 3-4 day meeting – for both Analysis and again for Design. Note: sometimes these analysis and design efforts are combined – and the debriefing modified appropriately.

Note: a typical cycle time for both the Analysis Team Meeting and the Design Team Meeting is 3 days each, where we move quickly through the structured process, gathering consensus data at each step at "the level of depth" needed for the next downstream step. This is where many members of the team might struggle: they expect us to boil the ocean to get a cup of tea and detail everything right here. The process doesn't do that up front, because many of the things found or uncovered won't be deemed critical enough to warrant investments in training (being left to Un-Structured OJT), and we don't need extreme depth and detail to enable the Customer and Stakeholders to make the business decisions about what to continue to address with investments of effort, time and money, and what to drop right away, where they don't see enough R for the I (as in ROI).

The debriefing is focused around these five questions.

  1. What percent of everything under the “sun and moon” did we capture in terms of our coverage of the outputs, tasks, and enabling knowledge and skills within our project’s scope?
  2. What percent of everything critical, and not just necessary, did we capture in terms of our coverage of the outputs, tasks, and enabling knowledge and skills within our project’s scope?
  3. What did you personally think of the product we produced?  The content of both the Performance Model charts and the Knowledge/Skill Matrix charts?
  4. What did you think of the process we employed to produce the Performance Model charts and the Knowledge/Skill Matrix charts?
  5. What do you see as the key issues going forward for our Project Steering Team to address?

I go to the flip chart and on a blank page I frame my first two questions so that everyone can read my words rather than try to remember what I said as we do a systematic round-robin.

Remember . . . always try to make it visible.

What % of “everything under the sun and moon” did we capture in terms of our coverage of the outputs, tasks, and enabling knowledge and skills within our project’s scope?

What % of “everything critical,” and not just necessary, did we capture in terms of our coverage of the outputs, tasks, and enabling knowledge and skills within our project’s scope?

I ask them to write their answers to these first two questions down on any piece of paper in front of them.  Too often I have sensed that members who did not write down their answers changed them as we went around the table asking for their numbers. Peer pressure among Master Performers – yes there too.

Here is the first example. Note that not all of the examples below present all of the feedback from the debriefing; some contain too much information and would give the client away.

This first one was done in a CAD – Curriculum Architecture Design effort’s Design Team Meeting. This Design Team included only people who had also been on the Analysis Team.


Again, I ask them to write their answers down on a piece of paper in front of them.  Too often I have sensed that members who did not write down their answers changed them as we went around the table asking for their numbers. That taints the data as far as the Master Performers in the room are concerned, and taints the entire effort, which is something to avoid after going to all of this trouble.

Ah, again, group think and peer pressure at work, and not for good purposes. Here I don't want consensus; I want their real feelings, like them or not. "Let the feedback chips fall where they may" at this point in my process.  So now I ask everyone to write down their scores, and then I go around the room systematically and get each set of numbers, or I gather all of the papers and read them out myself, handing them back as I read them. Either way, I write the numbers down as they are called out.

I also tell them in advance that no one will have to explain their numbers to anyone else in the group.  In fact I’ll cut off the discussion, because the point of this little exercise is to get the individual feelings of each group member out as to how well we did, not to arrive at some consensus percentages.

Once I've gone around the room gathering each set of numbers, I thank them for their inputs and feedback and then try to move quickly on to the next three questions: 3, 4 and 5 above.
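Once the two percentage answers are collected, a quick tabulation can make the spread of opinion visible at a glance. The sketch below is purely illustrative and not part of the PACT Processes; the function name `summarize_scores`, the member labels, and the numbers are all hypothetical assumptions.

```python
# Hypothetical helper for tabulating the two debrief percentages.
# Not part of PACT; names and numbers here are illustrative only.
from statistics import mean

def summarize_scores(scores):
    """scores: dict of member -> (pct_sun_and_moon, pct_critical).
    Returns the mean and the min-max spread for each question."""
    q1 = [a for a, _ in scores.values()]  # "everything under the sun and moon"
    q2 = [b for _, b in scores.values()]  # "everything critical"
    return {
        "q1_mean": mean(q1), "q1_spread": max(q1) - min(q1),
        "q2_mean": mean(q2), "q2_spread": max(q2) - min(q2),
    }

# Example round-robin results (hypothetical):
team = {"MP-1": (60, 90), "MP-2": (70, 95), "MP-3": (55, 85)}
print(summarize_scores(team))
```

A wide spread on either question is itself a signal worth noting, even though no one is asked to explain their individual numbers.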

But often enough, someone will ask someone else to explain himself or herself regarding the numbers they gave.  Even if I try to control this I can’t always do so.  And sometimes the one individual who is being challenged or questioned wants to tell everyone why they feel the way they do.  I usually let them do so and let the dialogue go. Facilitating a structured process often requires judgment calls by the facilitator “all day long.”

I also know that if I listen really closely to what is being said, I just might learn something germane to my assignment of facilitating the team to produce the prescribed outputs.  It’s never too late, even at the debriefing stage, for my process (and me) to get corrective feedback.

This next example is from the Analysis Meeting in a CAD effort – where the Task data is often at the macro (versus micro) level. Micro Task detailing gets done in an MCD – Modular Curriculum Development effort – the PACT Processes version of ADDIE – first in the Design Phases – and again in the Development Phase.

More from the CAD Analysis Team effort.

Now in this next example, the same group of Master Performers shifts roles from the Analysis Team into their roles in the CAD Design Team Meeting, after the analysis data produced earlier has been used to conduct an ETA (Existing T&D Assessment) for ReUse potential. All of that was then reviewed in a Project Steering Team Gate Review Meeting (PST GRM).

If this conversation gives me insights into what the team and I have produced, great!  If it helps me figure out where the holes or burning issues are within the context of all of my organized data, great. I win, for I can now get it fixed sooner rather than later.

The goal of the team debriefing is to get their feelings out on the table so that you, the facilitator, know where they stand on the completeness and accuracy of what you collectively have produced; otherwise, how would you know?

At the end of most of my analysis and design meetings, I almost never have a really good personal feel for the total, overall accuracy and completeness of the products of my process facilitation and the team’s content contributions.

After all, I own the process, and they own the content.  I need to ask them about the content.  They would know.  I shouldn’t be expected to know.

Next, to get some words around those percentages, I write the next three questions on a flip chart page and post it for team reference as we conduct this next round.

  • What did you personally think of the product we produced?  The content of both the Performance Model charts and the Knowledge/Skill Matrix charts?  Or the content of the design?
  • What did you think of the process we employed to produce the Performance Model charts and the Knowledge/Skill Matrix charts?  Or the process we used for the design?
  • What do you see as the key issues going forward for our Project Steering Team to address?

I tell everyone that they can respond to all three questions, two, or one, or they may pass as they wish.  I also tell the group that I intend to go around the room systematically to give everyone a chance to have their say, without being cut off or distracted by others’ questions, challenges, agreements, etc.

And I tell them if they’d like to add or rephrase their captured quote, then they may do so after we’ve made the first round.

Graphic below… It’s at this point, in a Design Meeting, that most participants can now, finally, “see” where all of that Analysis data ended up, and see the point of generating it.

Another CAD Analysis effort’s debriefing score and comments.

More from that same effort. You might notice that for some we didn’t get granular enough; they know we don’t yet have the details needed to actually develop content. That is by design, for we don’t want to get into Analysis Paralysis when some potential training might never see the light of day and will remain Informal Learning, what I’ve called U-OJT: Un-Structured OJT (on-the-job training).

More comments…

And more commentary…

I’m always willing to take a second or third pass and give everyone a chance to add to their feedback.  Usually it is not needed.  But I am less willing to let go of process control because I have been burned and therefore learned the hard way that going in a nonsystematic, mixed order can result in

  • Someone being unhappy because they were not adequately heard and represented by their captured comments
  • Taking two or three times longer to process this last meeting agenda step than really necessary
  • Missing the opportunity to gain some additional insights from the feedback from our team due to the time required overrunning the time allotted

This effort – below – was to do the analysis without any decision made as to exactly what would be done with the data.

And more analysis commentary…

More…

The final commentary by the room full of Master Performers…

When some see the level and amount of data detail being generated by the process, they begin to fear that Training will keep them from their real day jobs and trap them in endless boring training, but we haven’t yet decided anything about length, or media and mode. They just get worried (an indictment of my clients, I am afraid). Sales people are almost always like that in my experience; they see the data generated by the process as “overkill in the extreme.”

In Summary

Debriefing in the PACT Processes is a twist of the old saw, “Tell them what you’re going to tell them, tell them, and then tell them what you told them.”

In a PACT analysis or design meeting, the debriefings are used to help those of us without extensive performance or content knowledge understand where we are in terms of our data or design’s completeness, accuracy, and appropriateness. And it builds confidence in the Master Performers (and other SMEs) that you might be working with – plus it boosts the confidence of the Project Steering Team as you progress through the process. Always a good thing.

Control what you can in the meeting.  Learn what you can.  And debrief in a flexible, yet structured manner so that you get what you are looking for, and not just what they happen to give you.

And be open to other debriefing means to these targeted ends: how good a job did we, the team and facilitator, do in our limited time in generating data that is complete, accurate and appropriate?

Note: this approach might be better adapted than adopted. As always – it depends.

I share this with you so that you might make the decisions on that.

# # #
