Back in the Day
The old grist mill. I know it well. When you’ve been around the Training/Learning/Knowledge Management biz as long as I have (since 1979) – and around the professional affinity group known as NSPI/ISPI in particular – you might recognize the mill, the grinder as:
Show us your data! And – data is plural!
The folks back in the day at NSPI/ISPI were a tough crowd – as that old saying goes. That’s the grist mill, the grinder, the “chew ’em up and spit ’em out” kind of challenge from challengers that I am used to from back in the day. But it was “all good.”
Back in those days – back at the NSPI Conferences (now ISPI), at the local chapter meetings, and in the publications of both – one would hear it directly, indirectly or in other words. Where is your data? Why do you think that?
Other words described the goal of such challenges. Words such as “Measured Results” or “Research-based.”
The demand to “Prove it!” would echo around the halls, rooms and pages – and those who ventured into those sanctums – free from Foo Foo – either knew what they were in for and came prepared – or they were the next time they ventured into the grinder – if they dared. They left their chaff elsewhere and brought pure grist for the mill. They learned to come prepared. Being challenged – in the academic tradition, by peers with a disciplined approach to data and research, who modeled that for the rest of us – was a good thing. An appropriate thing.
It was sometimes just too intimidating for some. They chose to not share their opinions – when they didn’t have some data to back those opinions up. Their voices, their opinions were not heard or read – often. Perhaps that is a good thing. An approach to QA – Quality Assurance. In fact – it was a good thing.
The Wisdom of the Crowd was then less crowded, more limited to those vocalizing data and their data-based opinions – as there is always interpretation of the data. They brought hard data and softer data. But data – and ideally beyond an “N of 1” – especially if the “N” was the speaker/author. “Could it be replicated?” was another challenge.
Today – fewer challenges. More acceptance in general of claims, of “I think.” More Foo Foo.
Today – “I think” has replaced “the research data says” in too many online posts and exchanges – and on paper.
I personally think (wink wink) that there needs to be an extended examination and evaluation and eventual emphasized effort to expose and extinguish all of the non-evidence-based “e” euphoria. Would you agree? We could call it “The Eleven Es” – but I’m only joking – a little. And I bet I made a good many of you go back and count the e’s.
Without going into all of the THINGS I think should be examined and exposed – many of those are listed here on this web site – I think/believe/feel strongly that we the members of a Crowd/Network – and our extended Crowds/Networks – should help our professional sisters and brothers to better help their clients and stakeholders – with evidence-based practice.
We need to help others separate the wheat from the chaff – the grist from the chaff.
What does the evidence (data) say? And what does it not say (to be fair)?
And where is the data lacking – and what might be good subjects for research efforts?
Shouldn’t someone – create a list of the Foo Foo – stuff dis-proven yet still lingering on the periphery and sometimes at center stage? Whose responsibility is it to shine the light on this? To bring the Foo Foo from the shadows and help others avoid it – or to challenge it when it is at center stage?
What are your thoughts on this…
- Yes – No?
- What grist would you have for the grinder – the mill – after it’s been separated from the chaff – via the evidence from research?
My experience – since 1982 – is that most clients don’t like sharing data with their consultants – about the successful results of their efforts. A few do. But most see it as proprietary data. Some of those shared successes of mine can be found here – in my Client Case Studies.
One success in particular was at Bank of America – in 1997 – for an effort to combine multiple Bank Curricula – from mergers – into one comprehensive set of T&D Paths for Tellers, Financial Relationship Managers, the Bank Assistant Manager, the Bank Manager and the District Manager – all in one CAD – Curriculum Architecture Design project – with one Analysis Team Meeting – and one Design Team Meeting – producing 5 T&D Paths – where we salvaged a lot of T&D for reuse “as is” – and created Event Specifications for all of the gaps.
My client was happy enough to engage me again – and then some more at another firm. But he didn’t tell me about the specific results from that first effort – until over 10 years later – when he wrote this Recommendation for me on LinkedIn:
“Guy is a true instructional design and performance improvement professional, author and practitioner. While I was working at Bank of America we commissioned Guy and his CADDI team to redesign the three retail bank learning and development programs into one high performing curriculum design. As a result of this work we were able to reduce turnover at the frontline teller positions by an average of 30%. Guy’s ethics and proven approach made the effort very cost effective and fast to implement. I would recommend Guy for his knowledge of human performance technology, for his client service focus and for his business ethics. Randy Kohout VP, CUNA Mutual Group” March 22, 2009
The results: reducing the turnover at the frontline teller positions by an average of 30%.
Now that’s some data! I know many of my clients had similar results – they would talk about them to me directly – sometimes with very hard data – sometimes with their suspicions – but few have shared specifics such as these.
Do your clients (internal or external) share such data? Do they measure results? Do they measure meaningful results? Will they allow you to share that?
# # #