Or you can say that you derive Learning Objectives from the Performance Objectives. Systematically derive.
For an Enterprise Learning Context
Most of the time, anyway. Most of the time the Performance to be impacted – at some I for some R – should be fairly clear. Clear enough.
But there are times in an Enterprise where learning something now will translate into performance later – performance whose payoff only becomes apparent later. New, bleeding-edge technologies and tools, out there on the horizon, sometimes need to be taught/learned so that people can figure out if/how to take advantage of them in the future – near-term, mid-term, or long-term.
In other Learning Contexts – Educational Learning Contexts and Personal Learning Contexts – the opposite may be true (not always!) – the end performance, the terminal performance, may not be so clear.
How Do You Develop Your Learning Objectives?
Too often I’ve seen them developed as a result of a loose brainstorming process. Most of the time what gets produced is incomplete, inaccurate and inappropriate – given the terminal performance that could and/or should be impacted.
Too often the design and development efforts don’t start with the TESTS – Performance Tests preferred over Knowledge Tests for most situations, depending on what Content follows. And those Tests aren’t authentic. Multiple-choice exercises/tests may not be authentic at all, let alone authentic enough.
Level 4 assessment mechanisms’ content and methods should be crystal clear from the moment the Performance Modeling analysis effort’s dust settles. Before systematically deriving the enabling knowledge and skills.
Level 3 assessment mechanisms’ content and methods should be crystal clear from the moment the Performance Modeling analysis effort’s dust settles. Before systematically deriving the enabling knowledge and skills.
Level 2 assessment mechanisms’ content and methods should be crystal clear from the moment the analysis dust settles both the Performance Analysis for the Terminal Learning Objectives – and the enabling Knowledge/Skills Analysis for the Enabling Learning Objectives – or whatever/however you label your two-tier Learning Objectives.
Level 1 assessment mechanisms’ content and methods should be crystal clear BEFORE any analysis dust is raised – let alone settles. A standard smile sheet approach need not reflect any specific content – but it should reflect the deployment type. Different smile sheets for e-Learning versus Instructor-led Training are all that is needed – IF the other components of post-Learning evaluation are in place.
Should Versus Could
Should you always do Level 1? No.
Should you always do Level 4? Yes.
When should you do a Level 3? When the results of the Results (Level 4) assessment are below targets or expectations.
When should you do a Level 2? When the results of the Transfer (Level 3) assessment are below targets or expectations.
When should you really do a Level 1? When the results of the Mastery (Level 2) assessment are below targets or expectations.
Yes, you can always do a Level 1 to gather Voice of the Learner feedback – but research has shown that it does not correlate with Level 4 – which is what it’s all about in the first place! But if it makes everyone feel good to have Customer Satisfaction data – then go ahead. Just don’t spend too much time or money on it – and don’t react to that Reaction data as if it really is the key – or a clue about what is happening at Level 4.
Which is what it’s all about.
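The Should-versus-Could cascade above can be sketched as a simple decision procedure – always assess Level 4, and only drill down to a lower level when the level above it misses its targets. A minimal, purely illustrative sketch (the function name and boolean parameters are my own, not part of any ISD toolset):

```python
def levels_to_evaluate(results_ok: bool, transfer_ok: bool, mastery_ok: bool) -> list[int]:
    """Return which evaluation levels to conduct, working backward from
    Level 4 (Results): each lower level is investigated only when the
    level above it came in below targets or expectations."""
    levels = [4]              # always do Level 4 (Results)
    if not results_ok:        # Results below target -> check Transfer
        levels.append(3)
        if not transfer_ok:   # Transfer below target -> check Mastery
            levels.append(2)
            if not mastery_ok:  # Mastery below target -> check Reactions
                levels.append(1)
    return levels

# If Results meet targets, Level 4 alone suffices:
print(levels_to_evaluate(True, True, True))    # -> [4]
# If everything misses targets, all four levels are warranted:
print(levels_to_evaluate(False, False, False)) # -> [4, 3, 2, 1]
```

In practice the lower-level data would be gathered after the higher level disappoints, not all at once – the sketch just encodes the "when should you" logic of the preceding questions and answers.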
A Couple of Resources
Two resources include these two books – two of the 6 in my PACT 6-Pack – reconfigured and updated in 2011 from several prior books, past articles, columns, and Blog Posts. Available as paperbacks and/or Kindles.
For more about these 2 books – or the entire PACT 6-Pack – please go here.
# # #